10-02-2002 11:52 AM
backup to DLT.
I need to back up / to a DLT (or several) using tar, and would like to know the syntax to compress the files as they are backed up:
tar cvf /dev/rmt/1mn / ____________ please fill in the blank for me.
Or you can provide the same for fbackup; which utility is quicker/better for this type of backup?
We have a new HP-UX box and the SureStore autoloader is not compatible with Omniback, but I can do a tar or fbackup until this device gets replaced with the correct type of DLT.
Thanks for your timely replies.
Andre'
Solved!
10-02-2002 11:56 AM
Re: Compressing while backing up files to a DLT using Tar
You can download GNU tar from gnu.org.
Good luck!
10-02-2002 12:13 PM
Solution
cd /
tar cvf - . | compress > /dev/rmt/1mn
but you would probably see better performance if you reblocked via dd:
cd /
tar cvf - . | compress | dd ibs=512 obs=256k of=/dev/rmt/1mn
Having said this, DLTs should by default compress the data unless specifically overridden, and hardware compression is going to be much faster.
Moreover, fbackup is going to blow the performance of tar away. You can also send fbackup output to stdout via -f - and pipe that to compress, as in the above example.
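For anyone wanting to try the pipeline above without a tape drive handy, here is a minimal sketch of the same archive-then-compress pattern. It assumes gzip as a stand-in for HP-UX compress, and an ordinary file as a stand-in for the tape device /dev/rmt/1mn:

```shell
# Archive + compress in one pass; gzip stands in for HP-UX compress,
# and a plain file stands in for the tape device /dev/rmt/1mn.
mkdir -p /tmp/dlt_demo/src
echo "hello" > /tmp/dlt_demo/src/file.txt
cd /tmp/dlt_demo/src
tar cf - . | gzip > /tmp/dlt_demo/backup.tar.gz

# Verify by running the reverse pipeline and listing the archive contents.
gunzip -c /tmp/dlt_demo/backup.tar.gz | tar tf -
```

The same shape applies with dd inserted after the compressor for reblocking; only the final destination changes.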
10-02-2002 12:34 PM
Re: Compressing while backing up files to a DLT using Tar
I was just testing your first example and got an I/O error message. Our I.S. department has decided they do not need a backup done until Friday night (I guess it is not important to back up this new system, which they took 14 days to prep for training, until after the training is complete on Friday).
I will test this further tomorrow, for my own development and understanding.
Thank you for your help and wisdom.
Andre'
10-02-2002 02:56 PM
Re: Compressing while backing up files to a DLT using Tar
Get and install GNU tar (sometimes called gtar), ported to HP-UX:
http://hpux.cs.utah.edu/hppd/hpux/Development/Libraries/libiconv-1.8/
http://hpux.cs.utah.edu/hppd/hpux/Gnu/gettext-0.11.5/
http://hpux.cs.utah.edu/hppd/hpux/Gnu/tar-1.13.25/
It's really a great tool and does more than tar could ever do!
live free or die
harry
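As a sketch of what the GNU tar suggestion buys you in practice: its -z flag compresses while archiving, so no separate pipe through compress is needed. This assumes GNU tar is on the PATH as tar (on HP-UX it was often installed as gtar):

```shell
# GNU tar's -z flag compresses (via gzip) while creating the archive,
# replacing the separate "| compress" pipeline stage.
mkdir -p /tmp/gtar_demo
echo "data" > /tmp/gtar_demo/a.txt
tar -C /tmp/gtar_demo -czf /tmp/gtar_demo.tgz a.txt   # create compressed archive
tar -tzf /tmp/gtar_demo.tgz                           # list archive contents
```

For a tape target you would substitute the device file (e.g. /dev/rmt/1mn) for /tmp/gtar_demo.tgz.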
10-02-2002 06:34 PM
Re: Compressing while backing up files to a DLT using Tar
1> You can really only compress successfully once. If you attempt to compress again, the size generally grows a percent or two, since the algorithms can't find repeating patterns (etc) anymore in the data stream.
2> Given #1, I would echo Clay's comments about just leaving the compression up to the tape drive. First, it is the default for the DLT to compress; second, it's fast (and burns no CPU cycles in your host); third, it is so low-level you don't even know or care about it (no complexity). The HW compression is basically useless if you've run 'compress' on the data stream first.
3> The "not supported on Omniback" is possibly a red herring for you. If the drive(s) and the robotics/picker are recognized by the OS (use ioscan to determine this), the fact that they are not on the Omniback support matrix doesn't mean they won't work, it means they haven't been tested by HP. In many cases (not all), if you configure Omniback with the device files for the drive(s) and robotics, it will go ahead and run them, provided the robot accepts commands from the default schgr/sctl driver. Try it, it very well might work. Certainly, it is no more difficult than the task you are asking about...
4> If you could run Omniback, you wouldn't be doing any of this compress/tar stuff anyway, right? You'd just let Omniback hand it to the DLT, to be compressed there. That all works just the same, with or without Omniback. I say again, let the DLT do the compression.
5> Harry and Vince both suggested GNU tar (gtar). If you can't get Omniback to work, I would suggest it as well. It does everything you are looking for, and more. [But I still say, leave the compression out in the DLT, for all the above reasons.]
HTH, Good Luck. --bmr
10-02-2002 06:54 PM
Re: Compressing while backing up files to a DLT using Tar
The basic issue with "tar" is that it's still stuck in the '80s: it can't handle files larger than 2 GB.
If you want to use "tar", use GNU tar. Or do what I had to do in the early '90s: rebuild a multi-billion-dollar bank's checking and savings history files (statement data) from dozens of reports, because tar couldn't handle files over 2 GB! We of course immediately changed the way we did backups.
live free or die
harry