
Tar problem

 
SOLVED
Deborah Grierson
Frequent Advisor

Tar problem

I want to back up a file system to tape using tar. However there is one very large file which I don't want to back up. Any ideas how I can do this easily?
I'll need all the help I can get
9 REPLIES
harry d brown jr
Honored Contributor

Re: Tar problem

get gnu tar:

http://hpux.cs.utah.edu/hppd/hpux/Gnu/tar-1.13.25/

or use fbackup
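Once GNU tar is installed, excluding one file is a one-liner. A minimal sketch (the device path and file name are illustrative assumptions, and the HP-UX port may install the binary as "gtar" rather than "tar"):

```shell
# GNU tar can exclude a file directly; ./bigfile is an assumption,
# and in real use the archive would be the tape device /dev/rmt/0m
cd "$(mktemp -d)"                 # scratch directory for the demo
touch keep.txt bigfile
tar -cvf /tmp/gnu_exclude_demo.tar --exclude='./bigfile' .
```

If I recall the flags correctly, HP-UX fbackup can do the same with its -e (exclude) option, e.g. `fbackup -f /dev/rmt/0m -i /home/data -e /home/data/bigfile`.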


live free or die
harry
Live Free or Die
Joaquin Gil de Vergara
Respected Contributor

Re: Tar problem

POSIX tar is limited to 2 GB file size

use another method (fbackup or a backup software)
Teach is the best way to learn
Keely Jackson
Trusted Contributor

Re: Tar problem

Hi

Could you use 'find' to get the files you want and pipe it into tar?

Cheers
Keely
Live long and prosper
John Poff
Honored Contributor

Re: Tar problem

John,

One way to try it is to create a specific list of files to go into the tar archive. On a small scale it could work like this:

ls -l
total 10
-rw-r----- 1 jpoff users 325 Oct 1 09:51 a.txt
-rw-r----- 1 jpoff users 325 Oct 1 09:51 b.txt
-rw-r----- 1 jpoff users 325 Oct 1 09:51 c.txt
-rw-r----- 1 jpoff users 325 Oct 1 09:51 d.txt
-rw-r----- 1 jpoff users 18 Oct 1 09:52 file.list

Where file.list contains (excluding the c.txt file):

a.txt
b.txt
d.txt

Then this would work (I'm using the Korn shell):

>tar cvf jp.tar $(< file.list)
a a.txt 1 blocks
a b.txt 1 blocks
a d.txt 1 blocks


If you have a huge list of files this method probably won't work so well, because the expanded command line can exceed the shell's argument-length limit.
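One workaround for long lists is to create the archive from the first entry and append the rest in batches with tar's -r (append) flag. A sketch under the same file names as the example above, using a scratch directory:

```shell
# scratch setup mirroring the example above (c.txt is the excluded file)
cd "$(mktemp -d)"
touch a.txt b.txt c.txt d.txt
printf 'a.txt\nb.txt\nd.txt\n' > file.list

# create the archive from the first list entry, then append the rest;
# xargs splits the list into batches that fit the argv length limit
tar cvf jp.tar "$(head -1 file.list)"
tail -n +2 file.list | xargs tar rvf jp.tar
```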

JP
A. Clay Stephenson
Acclaimed Contributor

Re: Tar problem

Excluding files in tar is not easy because when a directory is listed, all files in that directory are recursively copied. You can use the GNU version of tar to exclude files, but if you want to use a 'standard' utility then your best bet is
find . ! \( -name 'myfile' \) -print | cpio -ocv > /dev/rmt/0m
Like 'standard' tar, cpio is limited to files of 2 GB; GNU tar is not.
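A runnable sketch of the find/cpio approach, with a scratch directory and a file archive standing in for the tape device (all paths are illustrative assumptions):

```shell
cd "$(mktemp -d)"                 # scratch tree for the demo
touch keep1 keep2 bigfile

# archive everything except bigfile; in real use redirect to /dev/rmt/0m
# (the archive goes outside the tree so find does not pick it up)
find . ! \( -name 'bigfile' \) -print | cpio -ocv > /tmp/cpio_demo.cpio

# list the table of contents to verify the exclusion
cpio -it < /tmp/cpio_demo.cpio
```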
If it ain't broke, I can fix that.
S.K. Chan
Honored Contributor

Re: Tar problem

2 ways. First, create a list (like an index) of directory or file names, for example a file called "list":
./fileA
./dirA
./dirB/fileB
then tar it up like so ..
# tar cvf /dev/rmt/0m $(cat list)
That way you can isolate the file that you do not want to back up. Also take note that you can append with "tar rvf", which you may find helpful. For instance, if the "big" file is /test/bigfile and you want to back up everything else, you can tar up everything outside /test first and then append all the other files in /test except that "big" file. You would still have to create a list like the one above.
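The append idea can be sketched as follows, under stated assumptions: a scratch tree stands in for /test, and a file archive for the tape device:

```shell
cd "$(mktemp -d)"                 # scratch layout: other/ plus test/ with the big file
mkdir other test
touch other/a other/b test/bigfile test/small1 test/small2

# back up everything outside test/ first
tar cvf /tmp/append_demo.tar other

# build a list of everything in test/ except the big file, then append it
find test -type f ! -name 'bigfile' -print > list
tar rvf /tmp/append_demo.tar $(cat list)
```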
MANOJ SRIVASTAVA
Honored Contributor

Re: Tar problem

Hi


Tar has a limitation for files bigger than 2.0 GB; you may try GNU tar, which handles large files. Ideally, why don't you switch over to fbackup?

Manoj Srivastava


ITeam
Super Advisor
Solution

Re: Tar problem

Use the find command to get the list of files to be backed up, then pipe it through egrep (or grep -v) to exclude the chosen file(s), and finally feed the output to tar.

This works just fine when you have loads of files in a subdirectory but have specific files, or even whole subdirectories, you want excluded.

For example, to dump contents of two data areas, but exclude the "test" sub-directories
you could use...

cd /disk1/live
tar -cvf /dev/rmt/0m $(find data1 data2 -type f -print | egrep -v 'data1/test|data2/test')

The -type f matters: if the directory names themselves were passed to tar it would recurse into them, test sub-directories included. The quotes around the egrep pattern are needed so the shell does not treat the | character, which egrep reads as "or", as a pipe.

This method is commonly used by our customers (though we actually use a hybrid version of tar that is much more efficient).
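The accepted method can be exercised end-to-end against a scratch tree, with a file archive standing in for /dev/rmt/0m (every path below is an assumption made for the demo):

```shell
cd "$(mktemp -d)"
mkdir -p data1/test data2/test
touch data1/keep1 data1/test/skip1 data2/keep2 data2/test/skip2

# -type f keeps tar from recursing into the excluded directories via
# their parents; egrep -v drops the test sub-trees from the list
tar -cvf /tmp/solution_demo.tar \
    $(find data1 data2 -type f -print | egrep -v 'data1/test|data2/test')

# show what actually went into the archive
tar -tf /tmp/solution_demo.tar
```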
Deborah Grierson
Frequent Advisor

Re: Tar problem

All,

Thanks for your help. I went with the Alphameric solution and it worked fine.

John
I'll need all the help I can get