
Tar and archives over 2 gig

 

I am faced with the challenge of archiving approx. 9 GB onto CDs for a customer. I am using HP-UX 11 with tar and gzip. The problem I am having is that anything over 2 GB and tar has a nervous breakdown.

Can anybody suggest a better way, or possibly a script that starts a new file when, say, a 1.8 GB threshold is met?

I have tried GNU tar, and it will let me tar and gzip all at once and doesn't seem to mind crossing the 2 GB boundary. But I can't seem to make the result portable enough to send out to someone without GNU tar... I am also having a problem with long paths (over 100 characters): GNU tar can extract them just fine, but HP's tar can't.

Thanks in advance :-D

DaveAA2
What's it there for if you can't use it
8 REPLIES
Bill Hassell
Honored Contributor

Re: Tar and archives over 2 gig

This is the problem with working with 20-year-old software designed for a few dozen megs. tar is an industry standard, and the header field that identifies the size of a file can only hold a number up to 2 GB. If HP were to make the header larger, then no other flavor of standard tar could read it. GnuTAR (if you compile the large-file version) does use an incompatible header for large files, but as you've seen, GnuTAR must then exist on the target system in order to read the archive.
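For the curious, the ceiling is visible right in the archive itself. A minimal sketch, assuming a ustar-format archive named big.tar (the name is made up; the offsets come from the standard ustar header layout):

# The size field is an 11-digit octal string at byte offset 124
# of the member's 512-byte header block:
dd if=big.tar bs=1 skip=124 count=11 2>/dev/null; echo

# 11 octal digits top out at 077777777777, i.e. just under 8 GB;
# a tar that parses that field into a signed 32-bit integer
# gives up at 2 GB:
echo 'obase=10; ibase=8; 77777777777' | bc    # 8589934591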

Now the question is: are these files going to be used on another HP-UX system? If yes, then use fbackup, which has no limitations on file size or archive size. tar (and cpio and pax) also has limitations on user IDs: nothing over 60k can be recorded, so larger UIDs are replaced with the current user's UID. And tar (and cpio, pax, etc.) cannot handle ACLs.
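If both ends do turn out to be HP-UX, the fbackup/frecover route would look roughly like this (a sketch only; the archive path and source directory are made up):

# Write /data to a disk archive (fbackup can also write to tape):
fbackup -v -f /archive/data.fb -i /data

# On the destination HP-UX box, restore everything from it:
frecover -v -f /archive/data.fb -r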

If the files are going to a non-HP-UX system, then you'll have to manually break up the files so they'll fit and use a common format.


Bill Hassell, sysadmin

Re: Tar and archives over 2 gig

Bill,

I'm really not sure what system will be used when untarring these files, so I guess I'll just have to split them up manually, as I have in the past...

Thanks again

DaveAA2
What's it there for if you can't use it
James R. Ferguson
Acclaimed Contributor

Re: Tar and archives over 2 gig

Hi Dave:

If you run 11.11, there is a patch available for both 'tar' and its cousin 'pax' which raises large-file support from 2 GB to 8 GB (installation sketched below the patch list).

For 11.11 tar - PHCO_26423
For 11.11 pax - PHCO_26422
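Either patch goes on with the usual SD-UX commands. A sketch (the depot path is hypothetical, and patch installs may want extra -x options such as autoreboot):

# Is the tar patch already installed?
swlist -l product PHCO_26423

# If not, install it from the downloaded depot:
swinstall -s /tmp/PHCO_26423.depot PHCO_26423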

Regards!

..JRF...
Bill Hassell
Honored Contributor

Re: Tar and archives over 2 gig

Correct, but since the target system is unknown (if HP-UX, it would also have to be patched), the 'new' tar would be no more compatible than GnuTAR.


Bill Hassell, sysadmin
James R. Ferguson
Acclaimed Contributor

Re: Tar and archives over 2 gig

Hi Bill:

Agreed [of course] :-)

Regards!

...JRF...
harry d brown jr
Honored Contributor

Re: Tar and archives over 2 gig

dave,

Just tar up the HP-UX, AIX, and Solaris gtar binaries in the first "archive" and then append your files to the end. That way the receiver can extract, using normal tar, the GNU tar they need.
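A rough sketch of that layout (the file names are hypothetical, and it assumes the receiver's stock tar can read the leading entries before it chokes on the oversized GNU-format ones):

# Lead with plain-tar-readable entries: the gtar binaries themselves
tar cvf bundle.tar gtar.hpux gtar.aix gtar.solaris

# Append the real payload with GNU tar (r = append to the archive)
gtar rvf bundle.tar bigfile1 bigfile2

# Receiver: pull out the right binary with the stock tar...
tar xvf bundle.tar gtar.hpux
# ...then use it to extract everything else
./gtar.hpux xvf bundle.tar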

live free or die
harry
Live Free or Die
Yogeeraj_1
Honored Contributor
Solution

Re: Tar and archives over 2 gig

Hello,

I did something similar a few months ago with my Oracle Database exports. Maybe you can find some hints in it.

Hope this helps

Best Regards
Yogeeraj


============================================================
The 2 GB limit applies IF you let export write directly to the file. I export to a PIPE and have gzip and split read the pipe. The result is a couple of 500 MB compressed files that constitute the export. At 500 MB, any utility can deal with these files and I can move them around more easily.

Here is the CSH script I use to show you how it is done. It does a full export and then tests the integrity of the export by doing a full import with show=y. That gives me a file with all of my source code and DDL to boot.

#!/bin/csh -vx

# "/" means an OS-authenticated login for exp/imp
setenv UID /
setenv FN exp.`date +%j_%Y`.dmp
setenv PIPE /tmp/exp_tmp_ora8i.dmp

setenv MAXSIZE 500m
setenv EXPORT_WHAT "full=y COMPRESS=n"

echo $FN

cd /nfs/atc-netapp1/expbkup_ora8i
ls -l

# clear out the last run, then recreate the named pipe (mknod ... p)
rm expbkup.log export.test exp.*.dmp* $PIPE
mknod $PIPE p

date > expbkup.log
# start the reader first, in the background: gzip compresses the
# pipe's contents and split cuts the stream into $MAXSIZE pieces
( gzip < $PIPE ) | split -b $MAXSIZE - $FN. &
#split -b $MAXSIZE $PIPE $FN. &

# exp writes into the pipe, so no single file ever reaches 2 GB
exp userid=$UID buffer=20000000 file=$PIPE $EXPORT_WHAT >>& expbkup.log
date >> expbkup.log

# verify: stream the pieces back through the pipe and run a
# show-only import (show=y reads the whole dump, writes nothing)
date > export.test
cat `echo $FN.* | sort` | gunzip > $PIPE &
#cat `echo $FN.* | sort` > $PIPE &
imp userid=sys/o8isgr8 file=$PIPE show=y full=y >>& export.test
date >> export.test

tail expbkup.log
tail export.test

ls -l
rm -f $PIPE

------------ eof -------------------------
============================================================
No person was ever honoured for what he received. Honour has been the reward for what he gave. (Calvin Coolidge)

Re: Tar and archives over 2 gig

yogeeraj,

Well, I'm not much of a programmer, but I was able to come up with something that works very well for my tar dilemma. I used the split command, as you did in your script, together with tar and gzip. Voilà: I am now able to split the archive into 1.5 GB chunks and then gzip them, as shown below:

tar cvf - `cat /local/tarlist.dat` | split -b 1500m - /FILENAME.tar.

The above command produces files such as the ones below

FILENAME.tar.aa
FILENAME.tar.ab
FILENAME.tar.ac
and so on

I then gzip all the files produced to shrink them (one command, shown after the list):

FILENAME.tar.aa.gz
FILENAME.tar.ab.gz
FILENAME.tar.ac.gz
and so on
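The compression pass itself is a single command over the chunks (assuming they sit in the current directory):

gzip FILENAME.tar.??     # each 1.5 GB chunk becomes a .gz in place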

Now the second part of the magic is on the customer's end; I will provide them with a small script file to untar the mess.

gzcat `echo FILENAME.tar.* | sort` | tar xvf -
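Wrapped up as the small script Dave mentions, the customer side might look like this (a sketch; the base name is a stand-in, and gzcat is the HP-UX spelling of zcat):

#!/bin/sh
# The shell expands the glob in suffix order (aa, ab, ac, ...),
# so the chunks reassemble into one stream for tar
gzcat FILENAME.tar.*.gz | tar tvf -     # dry run: list the contents
gzcat FILENAME.tar.*.gz | tar xvf -     # real run: extract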

And that's all she wrote...
Thanks again to all who replied; I got other very useful info from you as well.
DaveAA2 :-))))
What's it there for if you can't use it