Operating System - HP-UX

Maximum File size for Unix 9.x

 
Jeremy Foland
Advisor

Maximum File size for Unix 9.x

I found a posting stating the maximum file size for Unix 10.X but not 9.X

http://forums.itrc.hp.com/cm/QuestionAnswer/1,,0x66c3854994d9d4118fef0090279cd0f9,00.html

Can anyone tell me the max file size for 9.X?

I'm creating a huge tar file, but it seems to stop accepting new files once the size hits 169256960 bytes. Could it also be there are too many files in the tar?

Thanks.

Jeremy Foland
14 REPLIES
Sridhar Bhaskarla
Honored Contributor

Re: Maximum File size for Unix 9.x

Hi Jeremy,

If I recall correctly, it is 2 GB.

What is the error message you are getting?

-Sri
You may be disappointed if you fail, but you are doomed if you don't try
A. Clay Stephenson
Acclaimed Contributor

Re: Maximum File size for Unix 9.x

You are actually hitting two limits. The maximum size of any file under 9.x is 2GB; larger files (up to 128GB with the largefiles option) didn't come along until 10.20. In addition, tar will not back up any individual file larger than 2GB. You can get around the 2GB tar output limit by writing to a device node (e.g. /dev/rmt/0m), but that will not get you past the 2GB individual-file limit.
If it ain't broke, I can fix that.
Jeff Schussele
Honored Contributor

Re: Maximum File size for Unix 9.x

Hi Jeremy,

You should verify your ulimit.
Run
ulimit -a
and verify the maximum file size. Remember, the value is reported in 512-byte blocks.
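As a quick sanity check (a minimal sketch; 4194303 blocks is the default 9.x value), you can convert the block count to bytes yourself:

```shell
#!/bin/sh
# ulimit reports the file size limit in 512-byte blocks;
# multiply by 512 to get bytes.
blocks=4194303
bytes=`expr $blocks \* 512`
echo "$bytes bytes"    # 2147483136 bytes, i.e. just under 2GB
```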

Rgds,
Jeff
PERSEVERANCE -- Remember, whatever does not kill you only makes you stronger!
MANOJ SRIVASTAVA
Honored Contributor

Re: Maximum File size for Unix 9.x

The size is 2.0GB, but you are not hitting that at all. I would check the free space and the ulimit for the filesystem in terms of soft and hard limits.


Manoj Srivastava
Jeremy Foland
Advisor

Re: Maximum File size for Unix 9.x

I ran ulimit and it output 4194303 blocks, which translates to roughly 2.15GB. So that must not be the problem.

I have a TAR file that is 169MB and I'm trying to

tar rvf TARFILE ADDFILE

the addfile is pretty small

(Actually it's a script that rvf's several files to the archive)

but it just keeps grinding away, and the new file isn't added to the tar archive. I never get an error message. Any thoughts on what's up?

-Jeremy
Jeff Schussele
Honored Contributor

Re: Maximum File size for Unix 9.x

Try a
tar tvf TARFILE
to verify the integrity of the existing file - it may be corrupt.
Also if it was created with a specific blocking factor (-b) then you must use that factor again.

Rgds,
Jeff
PERSEVERANCE -- Remember, whatever does not kill you only makes you stronger!
Jeremy Foland
Advisor

Re: Maximum File size for Unix 9.x

tar tvf works fine.

I created the tar file with the default blocking factor.

Actually, the TAR is still running now (6 hours after it began). The file size is increasing, but very slowly.

The Script structure is:

tar cvf TARFILE Addfile_A
tar rvf TARFILE Addfile_B
tar rvf TARFILE Addfile_C
tar rvf TARFILE Addfile_D
.
.
.
tar rvf TARFILE Addfile_ZZ

After each successive tar command, does the system have to read to the end of TARFILE to find the append point? If so, each successive tar would get slower as TARFILE grows. (In the world of semiconductor manufacturing, we call that diffusion limited.)

-Jeremy Foland
Jeff Schussele
Honored Contributor

Re: Maximum File size for Unix 9.x

Yes, a tar file is a sequential file. It will read the entire file each time to find the end.

Is there a reason why you cannot "gang up" the additional files & use one tar rvf command?
i.e. tar rvf TARFILE file1 file2 file3......
Even if the tar rvf is dependent on other actions, you could "build" a variable called $FILES containing (file1 file2 file3 etc.)& then use
tar rvf TARFILE $FILES
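A minimal sketch of that approach (file names here are just placeholders):

```shell
#!/bin/sh
# Append several members in one pass instead of re-reading the
# archive once per file.
FILES="file1 file2 file3"
tar cvf TARFILE file0    # create the archive with the first file
tar rvf TARFILE $FILES   # ONE append pass for all remaining files
tar tvf TARFILE          # list the contents to verify
```

This way tar only seeks to the end of the archive once, no matter how many files are appended.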

HTH,
Jeff
PERSEVERANCE -- Remember, whatever does not kill you only makes you stronger!
Patrick Wallek
Honored Contributor

Re: Maximum File size for Unix 9.x

Instead of using 'tar -rvf', why not just do multiple 'tar -cvf' and use the no-rewind device?

So from the beginning:

tar -cvf /dev/rmt/0mnb first_stuff
tar -cvf /dev/rmt/0mnb more_stuff
tar -cvf /dev/rmt/0mnb even_more_stuff

This way once the first tar is done the tape will stay at the end of the first tar "file" and then you can start the next one at that spot.
Jeff Schussele
Honored Contributor

Re: Maximum File size for Unix 9.x

Hi Patrick,

Look up the thread a little - he's building a tar file, not writing out to tape. Although that is the right solution for a tape archive.

Still, the times he's seeing are interesting. I'm wondering if he has a problem with tar in and of itself. Could be a patch issue - but where he's going to get 9.x patches is an issue as well.

I'm wondering if GNU tar for 10.2 would work for him. It would be worth a try I'd think.

Jeremy - If you'd like to try the 10.2 GNU tar, you can get it here:

http://hpux.cs.utah.edu/hppd/hpux/Gnu/tar-1.13.25/

Rgds,
Jeff
PERSEVERANCE -- Remember, whatever does not kill you only makes you stronger!
Patrick Wallek
Honored Contributor

Re: Maximum File size for Unix 9.x

Hey Jeff,

DOH!!!!! I hate it when I do that. (See -- even Pharaohs screw up occasionally!)

Hmmm......Could it be a memory issue? Since this is a 9.X machine, I'm going to assume that it doesn't have much RAM or a very fast CPU. Have you checked while the tar is going to see if maybe you have started swapping? (swapinfo command).

If you are running out of memory while reading and writing lots of files, the buffer cache may not be behaving well and you may be experiencing memory pressure.

Jeremy Foland
Advisor

Re: Maximum File size for Unix 9.x

I like the idea of "ganging up" the files together. I hadn't thought of that. Will that prevent the system from reading to the end of the file each time?

Could you elaborate on exactly how I'd put the file names into the $FILES variable?

Thanks for the Help.

-Jeremy Foland
Jeff Schussele
Honored Contributor

Re: Maximum File size for Unix 9.x

Yes, it will only search to the end one time & then begin appending all the designated files to the tar file.

I've no idea just how you designate the files to append, but it could be as simple as:
FILES="/path/to/file1 /path/to/file2 /path/to/file3"

OR a series of adds to FILES
FILES=file1
FILES="$FILES file2"
FILES="$FILES file3"

Either way $FILES will end up with the 3 files designated.
NOTE - since whitespace is needed you must enclose the values in quotes.
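Put together, the incremental build might look like this (the file names are illustrative):

```shell
#!/bin/sh
# Accumulate member names in a variable, then append them all
# with a single tar invocation.
FILES=""
for f in Addfile_A Addfile_B Addfile_C
do
    FILES="$FILES $f"
done
tar rvf TARFILE $FILES    # one append pass for the whole batch
```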

Rgds,
Jeff
PERSEVERANCE -- Remember, whatever does not kill you only makes you stronger!
Bill Hassell
Honored Contributor

Re: Maximum File size for Unix 9.x

Here's a possibility: if the sizes of all the files look small based on ll, are any of the files sparse? That is, the application that created them wrote randomly inside the file, leaving a lot of holes. When a file is read serially, there is no way to tell that the missing (unwritten) records were never allocated - the operating system fills them in with zeros, which makes the copy much larger than the original file. So while ll may show just a few hundred megs, try a simple copy to another directory and see if the sizes stay the same. If not, you have sparse files and tar is hitting the 2GB limit.
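One quick way to spot a sparse file (a sketch; the file name is a placeholder) is to compare the allocated size reported by du against the logical length reported by ll:

```shell
#!/bin/sh
# Create a sparse file as a demonstration: write 1 byte at
# offset 1MB-1, so the logical size is 1048576 bytes but
# almost no blocks are actually allocated.
dd if=/dev/zero of=sparse.dat bs=1 count=1 seek=1048575 2>/dev/null
ls -l sparse.dat    # logical size: 1048576 bytes
du -k sparse.dat    # allocated size: far less than 1024 KB
```

If du reports far fewer blocks than the ll size implies, the file is sparse and will balloon when tar reads it serially.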


Bill Hassell, sysadmin