
Perl script pb

 
SOLVED
LE_1
Advisor

Perl script pb

Hi

I want to cut large text files in half and keep only the last 50% of the lines.

Here is what I did (I'm a newbie with Perl):

#!/opt/perl/bin/perl
foreach (@ARGV) {
    (-f $_) || next;                # skip anything that is not a plain file
    # stat() with no argument works on $_
    ($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size,
     $atime,$mtime,$ctime,$blksize,$blocks) = stat();
    ($size < 26000000) && next;     # only touch files bigger than ~26 MB
    open(FIC, "$_") || die("Error open $_ \n");
    @tab = <FIC>;                   # slurp the whole file into memory
    close(FIC);
    rename("$_", "${_}_old") || die("Error rename ${_}_old \n");
    open(FIC2, ">$_") || die("Error open $_ \n");
    $MILIEU = int(($#tab + 1) / 2); # index of the first line to keep
    for ($i = $MILIEU; $i <= $#tab; $i++) {
        print FIC2 $tab[$i];
    }
    close(FIC2);
    unlink("${_}_old") || die("Error delete old \n");
}

My problems are:

1/ When I use it on a 145 MB file, I get an "Out of memory" error on the @tab = <FIC>; line
(glance -m shows 950 MB of free memory).

2/ Any performance improvement is welcome.

Thanks.
8 REPLIES
Rodney Hills
Honored Contributor
Solution

Re: Perl script pb

Your problem is that you are trying to read the entire file into memory.

How about using "seek" to position the file handle halfway in, then reading the rest?

example-
open(FIC, "$_") || die("Error open $_\n");
open(FIC2, ">${_}NEW") || die("Error open ${_}NEW\n");
seek(FIC, int($size / 2), 0);   # jump to the middle of the file ($size from your stat() call)
$dmy = <FIC>;                   # bypass the partial record at that position
while (<FIC>) {
    print FIC2 $_;
}
close(FIC);
close(FIC2);
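
For reference, here is a sketch of how that could slot into your original loop (keeping your 26000000-byte threshold; note that seeking to int($size / 2) splits by bytes, so it keeps only approximately the last 50% of the lines):

#!/opt/perl/bin/perl
# Sketch only: stream the second half of each large file instead of slurping it.
foreach (@ARGV) {
    (-f $_) || next;
    $size = -s $_;                        # file size in bytes
    ($size < 26000000) && next;           # same threshold as the original script
    open(FIC, "$_")        || die("Error open $_ \n");
    open(FIC2, ">${_}NEW") || die("Error open ${_}NEW \n");
    seek(FIC, int($size / 2), 0);         # jump to (roughly) the middle
    $dmy = <FIC>;                         # throw away the partial line we land on
    while (<FIC>) {
        print FIC2 $_;                    # copy the remaining lines
    }
    close(FIC);
    close(FIC2);
    rename("${_}NEW", "$_") || die("Error rename $_ \n");
}

Memory use stays flat because only one line is held at a time, whatever the file size.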

HTH

-- Rod Hills
There be dragons...
LE_1
Advisor

Re: Perl script pb

Thanks, that's better than @tab = <FIC>.

But can someone explain why I get an out-of-memory error when there is 950 MB of free memory?
H.Merijn Brand (procura
Honored Contributor

Re: Perl script pb

per-process limit
per-user limit

What does

# limit

or in some other shells

# limits

give you? That might explain the lot

Enjoy, Have FUN! H.Merijn
LE_1
Advisor

Re: Perl script pb

In ksh:
#ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 262144
stack(kbytes) 65536
memory(kbytes) unlimited
coredump(blocks) 4194303

and in csh:
#exec csh
limit
cputime unlimited
filesize unlimited
datasize 262144 kbytes
stacksize 65536 kbytes
coredumpsize 2097151 kbytes
descriptors 4096 files
memoryuse unlimited
H.Merijn Brand (procura
Honored Contributor

Re: Perl script pb

See, your stack size is capped at 64 MB and your data segment at 256 MB. Nowhere near 950 MB.

You would have to reconfigure your kernel to allow larger process sizes, but in fact that's not what you /want/. You don't need to, and don't want to, get anywhere close to those limits, since that would make your system swap.

Other (scripting) approaches will solve your problem far better.
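
One such approach (a sketch only; the IN/OUT handles and the NEW suffix are just illustrative names) is to make two passes over the file: first count the lines, then skip the first half and copy the rest, so only one line is ever in memory and the split is exactly 50% of the lines:

#!/opt/perl/bin/perl
# Two-pass sketch: keep exactly the last half of the lines without slurping.
foreach my $file (@ARGV) {
    (-f $file) || next;

    # Pass 1: count the lines.
    open(IN, "$file") || die("Error open $file \n");
    my $lines = 0;
    $lines++ while <IN>;
    close(IN);

    # Pass 2: skip the first half, copy the second half.
    my $skip = int($lines / 2);
    open(IN,  "$file")       || die("Error open $file \n");
    open(OUT, ">${file}NEW") || die("Error open ${file}NEW \n");
    while (<IN>) {
        next if $. <= $skip;          # $. is the current input line number
        print OUT $_;
    }
    close(IN);
    close(OUT);
    rename("${file}NEW", "$file") || die("Error rename $file \n");
}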

Enjoy, Have FUN! H.Merijn
LE_1
Advisor

Re: Perl script pb

Sorry, but I don't understand the link between the "ulimit" result (256 MB) and my "out of memory" error, which appears once the file size exceeds roughly 115 MB (I ran some tests).
LE_1
Advisor

Re: Perl script pb

It was this version of Perl:
Summary of my perl5 (revision 5.0 version 6 subversion 1) configuration:
Platform:
osname=hpux, osvers=11.00, archname=PA-RISC1.1-thread-multi
uname='hp-ux llbertha b.11.00 u 9000800 2002402864 unlimited-user license '
config_args='-des -Dcf_by=ActiveState -Dcf_email=ActivePerl@ActiveState.com -Uinstallusrbinperl -Ud_sigsetjmp -Dusethreads -Duseithreads -Duselargefiles -Dinc_version_list=5.6.0/$archname 5.6.0 -Dcc=gcc -Accflags=-mpa-risc-1-1 -fPIC -Dd_attribut=undef -Darchname=PA-RISC1.1 -Dcccdlflags=-fPIC -Dprefix=/opt/perl'
hint=recommended, useposix=true, d_sigaction=define
usethreads=define use5005threads=undef useithreads=define usemultiplicity=define
useperlio=undef d_sfio=undef uselargefiles=define usesocks=undef
use64bitint=undef use64bitall=undef uselongdouble=undef

I tried with perl5 (revision 5 version 8 subversion 4) and it works with the big file, even with less free memory.

Maybe it's a bug in the old release of Perl.
H.Merijn Brand (procura
Honored Contributor

Re: Perl script pb

That 5.6 is an ActiveState perl built for HP PA-RISC 1.1.

It's a threaded build without LARGEFILE enabled.

Drop it.

Use the 5.8 built for HP-UX directly.
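
To check what any given perl was built with (just a generic check, nothing specific to this build), you can ask the Config module which features were compiled in:

#!/opt/perl/bin/perl
# Print the build options that matter for this thread.
use Config;
print "perl version  : $]\n";
print "usethreads    : ", ($Config{usethreads}    || 'undef'), "\n";
print "uselargefiles : ", ($Config{uselargefiles} || 'undef'), "\n";
print "use64bitint   : ", ($Config{use64bitint}   || 'undef'), "\n";

The same information is available from the command line with perl -V:usethreads and perl -V:uselargefiles.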

Enjoy, Have FUN! H.Merijn