Community Home > Servers and Operating Systems > Operating Systems > Operating System - HP-UX > Re: Perl script pb
08-18-2004 03:26 AM
I want to cut big text files in two and keep only the last 50% of the lines.
This is what I did (I'm a Perl newbie):
#!/opt/perl/bin/perl
foreach (@ARGV) {
    (-f $_) || next;
    ($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size,$atime,$mtime,$ctime,$blksize,$blocks) = stat();
    ($size < 26000000) && next;
    open(FIC, "$_") || die("Error open $_\n");
    @tab = <FIC>;                       # slurp every line into memory
    close(FIC);
    rename("$_", "${_}_old") || die("Error rename ${_}_old\n");
    open(FIC2, ">$_") || die("Error open $_\n");
    $MILIEU = int(($#tab + 1) / 2);
    for ($i = $MILIEU; $i != $#tab + 1; $i++) {
        print FIC2 $tab[$i];
    }
    close(FIC2);
    unlink("${_}_old") || die("Error delete old\n");
}
My problems are:
1/ When I use it on a 145 MB file, I get an "Out of memory" error at the "@tab = <FIC>" line (glance -m shows 950 MB of free memory).
2/ Any performance improvement is welcome.
Thanks.
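One way to avoid holding every line in @tab (a sketch, not from the thread) is to read the file twice: once to count the lines, once to copy only the second half. Memory use then stays roughly constant whatever the file size.

```perl
#!/opt/perl/bin/perl
# Sketch (assumption, not the poster's code): keep the last half of each
# large file's lines without slurping the file into an array.
use strict;
use warnings;

foreach my $file (@ARGV) {
    next unless -f $file;
    next if -s $file < 26000000;        # same size threshold as the original

    # Pass 1: count the lines.
    open(my $in, '<', $file) or die "Error open $file: $!\n";
    my $lines = 0;
    $lines++ while <$in>;
    close($in);

    # Pass 2: skip the first half, copy the rest to a temporary file.
    my $skip = int($lines / 2);
    open($in, '<', $file)            or die "Error open $file: $!\n";
    open(my $out, '>', "$file.tmp")  or die "Error open $file.tmp: $!\n";
    while (<$in>) {
        next if $. <= $skip;            # $. is the current input line number
        print $out $_;
    }
    close($in);
    close($out);
    rename("$file.tmp", $file) or die "Error rename $file.tmp: $!\n";
}
```

This keeps the same 26000000-byte threshold and the same "keep the last half of the lines" behaviour as the script above, but never builds the line array.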
Solved! Go to Solution.
- Tags:
- Perl
08-18-2004 04:47 AM
Solution: How about using "seek" to position the file halfway, then reading the rest?
Example:
open(FIC, "$_") || die("Error open $_\n");
open(FIC2, ">${_}NEW") || die("Error open ${_}NEW\n");
seek(FIC, int($size / 2), 0);
$dmy = <FIC>;        # discard the partial line at the seek point
while (<FIC>) {
    print FIC2 $_;
}
HTH
-- Rod Hills
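A self-contained version of this seek idea (my completion of the fragment above; the final rename is an assumption, mirroring the original script's replace-in-place behaviour):

```perl
#!/opt/perl/bin/perl
# Sketch based on the seek() suggestion above: jump to the middle of the
# file by byte offset, drop the partial line there, copy the remainder.
use strict;
use warnings;

foreach my $file (@ARGV) {
    next unless -f $file;
    my $size = -s $file;

    open(my $in,  '<', $file)        or die "Error open $file: $!\n";
    open(my $out, '>', "${file}NEW") or die "Error open ${file}NEW: $!\n";

    seek($in, int($size / 2), 0);   # byte position, so this lands mid-line
    my $discard = <$in>;            # throw away the partial line
    print $out $_ while <$in>;      # copy the rest

    close($in);
    close($out);
    rename("${file}NEW", $file) or die "Error rename ${file}NEW: $!\n";
}
```

Note this halves by bytes rather than by line count, which is usually close enough for log trimming and needs only one pass over half the file.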
08-18-2004 08:24 PM
Re: Perl script pb
But can someone explain to me why I get an out-of-memory error with 950 MB of free memory?
08-18-2004 09:36 PM
Re: Perl script pb
It is probably a per-user limit. What does
# limit
or, in some other shells,
# limits
give you? That might explain the lot.
Enjoy, Have FUN! H.Merijn
- Tags:
- ulimit
08-18-2004 10:36 PM
Re: Perl script pb
# ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 262144
stack(kbytes) 65536
memory(kbytes) unlimited
coredump(blocks) 4194303
and:
# exec csh
limit
cputime unlimited
filesize unlimited
datasize 262144 kbytes
stacksize 65536 kbytes
coredumpsize 2097151 kbytes
descriptors 4096 files
memoryuse unlimited
08-18-2004 10:45 PM
Re: Perl script pb
You will have to reconfigure your kernel to enable larger process sizes, but in fact that's not what you /want/. You don't need to nor want to get anywhere close to those limits, since it will make your system swap.
Other (scripting) policies will solve your problems way better.
Enjoy, Have FUN! H.Merijn
08-19-2004 01:00 AM
Re: Perl script pb
But I didn't understand the link between the results of "ulimit" (256 MB) and my "out of memory" error, which appears when the file size is above about 115 MB (I did some tests).
08-19-2004 01:36 AM
Re: Perl script pb
Summary of my perl5 (revision 5.0 version 6 subversion 1) configuration:
Platform:
osname=hpux, osvers=11.00, archname=PA-RISC1.1-thread-multi
uname='hp-ux llbertha b.11.00 u 9000800 2002402864 unlimited-user license '
config_args='-des -Dcf_by=ActiveState -Dcf_email=ActivePerl@ActiveState.com -Uinstallusrbinperl -Ud_sigsetjmp -Dusethreads -Duseithreads -Duselargefiles -Dinc_version_list=5.6.0/$archname 5.6.0 -Dcc=gcc -Accflags=-mpa-risc-1-1 -fPIC -Dd_attribut=undef -Darchname=PA-RISC1.1 -Dcccdlflags=-fPIC -Dprefix=/opt/perl'
hint=recommended, useposix=true, d_sigaction=define
usethreads=define use5005threads=undef useithreads=define usemultiplicity=define
useperlio=undef d_sfio=undef uselargefiles=define usesocks=undef
use64bitint=undef use64bitall=undef uselongdouble=undef
I tried with perl5 (revision 5 version 8 subversion 4) and it works with the big file and with less free memory.
Maybe it's a bug in the old Perl release.
08-19-2004 07:09 AM
Re: Perl script pb
It's a threaded build without LARGEFILE enabled.
Drop it.
Use the 5.8 build for HP-UX directly.
Enjoy, Have FUN! H.Merijn
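To see which of these build options a given perl has, without reading the whole perl -V dump, the core Config module can be queried; a minimal sketch:

```perl
#!/usr/bin/perl
# Sketch: report the build options discussed above (threads, largefiles,
# perlio) for whichever perl runs this script, via the core Config module.
use strict;
use warnings;
use Config;

for my $opt (qw(version usethreads uselargefiles useperlio)) {
    my $val = defined $Config{$opt} ? $Config{$opt} : 'undef';
    print "$opt=$val\n";
}
```

On the command line, perl -V:uselargefiles prints the same single value, which makes it easy to compare two installed perls quickly.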