- Tar and archives over 2 gig
07-20-2002 04:13 AM
Can anybody suggest a better way, or possibly a script, that will create a new file when a threshold of, say, 1.8 GB is met?
I have tried GNU tar, and it will let me create a tar/gzip archive in one pass without minding the 2 GB boundary. But I can't seem to make the result portable enough to send to someone without GNU tar. I am also having a problem with long paths (over 100 characters): GNU tar can extract them just fine, but HP's tar cannot.
Thanks in advance :-D
DaveAA2
Solved! Go to Solution.
07-20-2002 04:45 AM
Re: Tar and archives over 2 gig
Now the question is: are these files going to be used on another HP-UX system? If so, use fbackup, which has no limitations on file size or archive size. tar (and cpio and pax) also has limitations on user IDs: nothing over 60,000 can be recorded, so the UID is changed to the current user's UID. And tar (and cpio, pax, etc.) cannot handle ACLs.
If the files are going to a non-HP-UX system, then you'll have to break the files up manually so they fit, and use a common format.
Bill Hassell, sysadmin
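A minimal sketch of the "break the files up manually" approach, using only the portable split(1) and cat tools every Unix has. The file names and the tiny sizes here are illustrative (the sketch uses kilobytes so it runs anywhere; in practice you would split well under the 2 GB mark):

set -e
# Make a sample "large" file (small here so the sketch runs anywhere).
dd if=/dev/urandom of=big.tar bs=1024 count=300 2>/dev/null

# Break it into fixed-size pieces: big.tar.aa, big.tar.ab, ...
split -b 100k big.tar big.tar.

# On the receiving side, reassemble in lexical order and verify integrity.
cat `echo big.tar.?? | sort` > rejoined.tar
cmp big.tar rejoined.tar && echo "archives match"

Because split names its pieces in lexical order (aa, ab, ac, ...), a sorted glob piped through cat always reproduces the original byte stream exactly.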
07-20-2002 06:39 AM
Re: Tar and archives over 2 gig
I'm really not sure what system will be used when untarring these files, so I guess I'll just have to split them up manually as I have in the past.
Thanks again
DaveAA2
07-20-2002 07:15 AM
Re: Tar and archives over 2 gig
If you run 11.11, there is a patch available for both 'tar' and its cousin 'pax' which raises largefiles support from 2 GB to 8 GB.
For 11.11 tar - PHCO_26423
For 11.11 pax - PHCO_26422
Regards!
..JRF...
07-20-2002 08:48 AM
Re: Tar and archives over 2 gig
Bill Hassell, sysadmin
07-20-2002 11:11 AM
Re: Tar and archives over 2 gig
Agreed [of course] :-)
Regards!
...JRF...
07-20-2002 11:26 AM
Re: Tar and archives over 2 gig
Just tar up the HP-UX, AIX, and Solaris gtar binaries in the first "archive" and then append yours to the end. That way the receiver can extract, using normal tar, the GNU tar they need.
live free or die
harry
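A sketch of that append trick with tar's 'r' (append) mode. The binary names are stand-ins for the per-platform gtar executables harry mentions, not files from the thread:

set -e
mkdir -p demo && cd demo
# Stand-ins for the per-platform GNU tar binaries (hypothetical names).
echo "gtar for HP-UX"   > gtar.hpux
echo "gtar for AIX"     > gtar.aix
echo "gtar for Solaris" > gtar.solaris

# First archive the helper binaries, then append the real payload with 'r'.
tar cf bundle.tar gtar.hpux gtar.aix gtar.solaris
echo "real data" > payload.txt
tar rf bundle.tar payload.txt

# Any plain tar can now pull out the gtar it needs, then use it on the rest.
tar tf bundle.tar

Note that append mode works on plain (uncompressed) archives only, and the bundle itself would still be subject to the receiving tar's 2 GB limit, so this helps with format portability rather than size.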
07-21-2002 08:59 PM
Solution
I did something a few months ago with my Oracle database exports. Maybe you can find some hints from this.
Hope this helps
Best Regards
Yogeeraj
============================================================
The 2 gig limit will apply IF you let export write directly to the file. I use export to a PIPE and have compress and split read the pipe. The result is a couple of 500 MB compressed files that constitute the export. At 500 MB, any utility can deal with these files and I can move them around more easily.
Here is the CSH script I use, to show you how it is done. It does a full export and then tests the integrity of the export by doing a full import with show=y. That gives me a file with all of my source code and DDL to boot.
#!/bin/csh -vx
setenv UID /
setenv FN exp.`date +%j_%Y`.dmp
setenv PIPE /tmp/exp_tmp_ora8i.dmp
setenv MAXSIZE 500m
setenv EXPORT_WHAT "full=y COMPRESS=n"
echo $FN
cd /nfs/atc-netapp1/expbkup_ora8i
ls -l
rm expbkup.log export.test exp.*.dmp* $PIPE
mknod $PIPE p
date > expbkup.log
( gzip < $PIPE ) | split -b $MAXSIZE - $FN. &
#split -b $MAXSIZE $PIPE $FN. &
exp userid=$UID buffer=20000000 file=$PIPE $EXPORT_WHAT >>& expbkup.log
date >> expbkup.log
date > export.test
cat `echo $FN.* | sort` | gunzip > $PIPE &
#cat `echo $FN.* | sort` > $PIPE &
imp userid=sys/o8isgr8 file=$PIPE show=y full=y >>& export.test
date >> export.test
tail expbkup.log
tail export.test
ls -l
rm -f $PIPE
------------ eof -------------------------
============================================================
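The pipe-plus-split pattern in that script is not Oracle-specific. A minimal standalone sketch of the same idea, with mkfifo in place of 'mknod ... p' and a plain copy of sample data standing in for the 'exp' writer (file names and the 4 KB chunk size are illustrative):

set -e
PIPE=./exp_pipe
rm -f "$PIPE" chunk.* out.dat src.dat
mkfifo "$PIPE"                 # same role as 'mknod $PIPE p' in the csh script
dd if=/dev/urandom of=src.dat bs=1024 count=64 2>/dev/null

# Reader side: compress whatever comes down the pipe and cut it into pieces.
( gzip < "$PIPE" | split -b 4k - chunk. ) &

# Writer side: anything that produces a big file can write to the pipe instead
# (the csh script points Oracle's exp here; cat of sample data stands in).
cat src.dat > "$PIPE"
wait

# Reassemble the pieces in order and decompress to recover the original stream.
cat `echo chunk.* | sort` | gunzip > out.dat
cmp src.dat out.dat && echo "round trip OK"

The key point is that no single file on disk ever holds the full uncompressed stream, so the 2 GB per-file limit is never hit by the writer.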
07-23-2002 06:23 AM
Re: Tar and archives over 2 gig
Well, I'm not much of a programmer, but I was able to come up with something that works very well for my tar dilemma. I used the split command, as you did in your script, with tar and gzip. Voila: I can now split the files into 1.5 GB chunks and then gzip them, as shown below:
tar cvf - `cat /local/tarlist.dat` | split -b 1500m - /FILENAME.tar.
The above command produces files such as the ones below
FILENAME.tar.aa
FILENAME.tar.ab
FILENAME.tar.ac
and so on
I then gzip all of the files produced to make them smaller:
FILENAME.tar.aa.gz
FILENAME.tar.ab.gz
FILENAME.tar.ac.gz
and so on
Now the second part of the magic is on the customer's end; I will provide them with a small script to untar the mess:
gzcat `echo FILENAME.tar.* | sort` | tar xvf -
And that's all she wrote...
Thanks again to all who replied I got other very useful info from you also.
DaveAA2 :-))))
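DaveAA2's recipe round-trips cleanly because split cuts the tar stream on byte boundaries and a sorted gzcat reproduces it exactly. A small end-to-end sketch (tiny 4 KB chunks instead of 1500m so it runs anywhere; gunzip -c stands in for HP-UX's gzcat, and the directory names are illustrative):

set -e
mkdir -p src
echo "hello from HP-UX" > src/file1
echo "more data"        > src/file2
rm -f FILENAME.tar.*

# Archive to stdout and cut the stream into fixed-size pieces.
tar cf - src | split -b 4k - FILENAME.tar.
gzip FILENAME.tar.??           # shrink each piece, as in the post

# Customer side: decompress the pieces in order and feed the stream to tar.
mkdir -p restore && cd restore
gunzip -c `echo ../FILENAME.tar.*.gz | sort` | tar xf -
cmp ../src/file1 src/file1 && echo "restored OK"

One design note: splitting first and compressing each chunk (as here) lets the receiver decompress pieces independently, whereas compressing first and then splitting (as in the Oracle script above) usually yields fewer, smaller chunks; both reassemble the same way.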