
Re: file size

 
SOLVED
P. Prinsloo
Advisor

file size

I am using awk to read a file, do some processing, and then write to an output file with 'print field1, field2, ... , fieldn > outfile'. The input file has more than 9 million records, but only just over 7 million records are written to the output file instead of the full 9 million.
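For reference, a minimal sketch of the pattern described above. The file names and field numbers are placeholders, not from the original post:

```shell
# Build a small sample input (placeholder data, not the 9-million-record file).
printf 'one two three\nfour five six\n' > input.txt

# Read each record and print selected fields to an output file.
# awk keeps "output.txt" open across records and closes it only at exit,
# so the redirection appends record after record to the same file.
awk '{ print $1, $3 > "output.txt" }' input.txt
```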

What happens is that as soon as the output file reaches 2147483136 bytes, it refuses to grow and simply stops in the middle of a record. I know the maximum value representable in Unix scripting is 2147483647 (2^31 - 1), and adding one more 512-byte block to the output file size would exceed that maximum.

Is this file size a limitation with awk?
Wally
Tom Geudens
Honored Contributor

Re: file size

Hi,
Probably your filesystem does not support "largefiles" (files >2Gb). If you have OnlineJFS you can activate this with :
#fsadm -o largefiles -F vxfs /your_filesystem

Search the forums with "largefiles" for more information ...

Regards,
Tom
A life ? Cool ! Where can I download one of those from ?
harry d brown jr
Honored Contributor

Re: file size


Do you have largefiles enabled on your filesystem?

You should consider perl over awk.

live free or die
harry
Live Free or Die
Cheryl Griffin
Honored Contributor

Re: file size

According to the largefiles white paper /usr/share/doc/lg_files.txt awk can handle largefiles but the pattern file will remain small. Here's the reference:

5.2.6 Text Processing Commands
Below is a list of handy commands that may be used to process files that are large. Each will appropriately handle large data files. However, pattern files for commands such as awk and sed will continue to remain small. There are no new options in this set of
commands.

awk
...

Cheryl
"Downtime is a Crime."
Chris Wilshaw
Honored Contributor
Solution

Re: file size

It sounds like the filesystem was created without the -olargefiles option.

You can check this by running

mkfs -m /dev/vgXX/lvolYY

where XX and YY are replaced by the numbers relevant to your filesystem.

eg from one of my systems:

mkfs -m /dev/vg05/lvol13
mkfs -F vxfs -o ninode=unlimited,bsize=1024,version=3,inosize=256,logsize=1024,largefiles /dev/vg05/lvol13 716800

As you can see, this lists largefiles in the options.
James R. Ferguson
Acclaimed Contributor

Re: file size

Hi:

The filesystem in which the output file is being built does not support largefiles. The size you cite (2.14748E+09) is the 2GB limit. You can easily enable largefiles for the filesystem:

# fsadm -F vxfs -o largefiles /dev/vgXX/rlvolX

...note the raw logical volume.

Regards!

...JRF...
P. Prinsloo
Advisor

Re: file size

I have checked the file system and it has been created with the largefiles option.
I've also been looking at the kernel and have the following settings
maxdsiz 163840
maxdsiz_64bit 262144
maxfiles 2048
maxfiles_lim 2048
maxssiz 7812
maxssiz_64bit 4096

My system is a N4000 with 6Gig memory and 8 CPUs


Wally
Sandip Ghosh
Honored Contributor

Re: file size

I think your maxdsiz and maxssiz are on the low end. Try increasing them. On our system, maxdsiz=67108864 and maxssiz=8388608.

Hope this helps.

Sandip
Good Luck!!!
A. Clay Stephenson
Acclaimed Contributor

Re: file size

I don't ever remember writing a 2GB file with awk, but I have done it with Perl. It would be very helpful to know the value of errno: when awk dies, you should immediately do an echo ${?} and note the result. You might also try downloading and installing the GNU version of awk (gawk) from one of the HP-UX Porting Centres.

Let's also not overlook the obvious: ulimit and quotas. I very much doubt that maxdsiz and maxssiz have anything to do with this.
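The two quick checks suggested above can be sketched as follows (the awk command and file names are placeholders, not the poster's actual job):

```shell
# Show the per-process file size limit; "unlimited" means no ulimit
# is capping the output file (many shells report this in 512-byte blocks).
ulimit -f

# Placeholder input for the demonstration.
printf 'a b\n' > input.txt

# Run the (placeholder) awk job, then capture its exit status
# immediately afterward; a non-zero value is the clue to investigate.
awk '{ print $1 > "out.txt" }' input.txt
status=$?
echo "awk exit status: $status"
```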
If it ain't broke, I can fix that.
Sanjay_6
Honored Contributor

Re: file size

Hi,

Looks like you need to enable "largefiles" for the lv in question. This will allow creation of files larger than 2GB on that filesystem.

http://support1.itrc.hp.com/service/cki/docDisplay.do?docLocale=en_US&docId=200000062683990

The Doc id is KBAN00000105

Hope this helps.

Regds