Operating System - HP-UX
yearly backups - revised
01-09-2002 01:23 PM
Been doing some research on the systems to be backed up and I am finding numerous files that are greater than 2GB.
The client does not want to use GNU TAR because it is not part of the default OS.
The client has also specified that the backup would be for an HP 9000 system, so I do not have to worry about portability. I am looking at fbackup/frecover for this task. Forgive me, but I can't even recall using fbackup.
Question - will fbackup handle files over 2GB on HP-UX 10.20?
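One way to confirm which files are over 2GB is a find along these lines (the /data path is only an example; find counts 512-byte blocks, so +4194304 blocks means larger than 2GB):
find /data -xdev -type f -size +4194304 -exec ls -l {} \;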
01-09-2002 01:25 PM
Re: yearly backups - revised
Yes, fbackup is a viable alternative. If the client wants an OS-supplied utility, use cpio instead of fbackup.
cpio is portable to any of the systems you will have.
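A minimal sketch of that approach (the /data directory and the /dev/rmt/0m tape device are only examples):
find /data -depth -print | cpio -ocv > /dev/rmt/0m    # write an ASCII-header archive to tape
cpio -icvdm < /dev/rmt/0m                             # read it back, creating directories and keeping mtimes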
Good luck.
01-09-2002 01:27 PM
Re: yearly backups - revised
cpio would also be a viable choice, but I believe I would still have the 2GB file limitation with cpio as well.
01-09-2002 01:29 PM
Re: yearly backups - revised
Yes, fbackup supports largefiles.
This doc might be helpful:
http://us-support3.external.hp.com/cki/bin/doc.pl/sid=bbf5848316ee940743/screen=ckiDisplayDocument?docId=200000057843813
Good luck,
-USA..
01-09-2002 01:32 PM
Solution
Fbackup will easily handle large files and can be made to span tapes. It will also be much faster than tar or cpio. If you have multiple tape drives, it will use the first one, then switch to the next one without intervention.
Man fbackup and frecover for details.
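As a rough sketch (the device files, the /home include path, and the index/log file names are only examples), a full backup that alternates between two drives and a complete restore might look like:
fbackup -0v -i /home -f /dev/rmt/0m -f /dev/rmt/1m -I /var/adm/fbackup.index > /var/adm/fbackup.log 2>&1
frecover -rv -f /dev/rmt/0m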
I am a little surprised that GNU tar is unacceptable but fbackup, which is certainly non-standard, is okay. I would ask once again: if you need to recover data 10 years from now, what will they do? With fbackup, maybe; with tar (GNU or otherwise), yes.
Oh well, there is no understanding users.
01-09-2002 01:43 PM
Re: yearly backups - revised
'fbackup' fully supports largefiles.
I find this utility an ideal backup tool which is quite efficient. Have a look at the man pages for both 'fbackup' and 'frecover' for more information.
By creating graph files you can include and exclude files and directories as you see fit.
While you can't use wildcards in 'fbackup' graphs, you can mix (i)nclude and (e)xclude statements. For instance, you could do this:
i /tmp
e /tmp/mything
e /tmp/thistoo
You can also dynamically build your graph. Generate the list of files and directories you want (don't want) with filters, redirect it into a file and add appropriate 'i' and 'e' specifiers.
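A small sketch of that idea (the paths are purely illustrative): include everything under /home, then exclude any core files that a find turns up:
echo "i /home" > /tmp/yearly.graph
find /home -xdev -type f -name core -print | sed -e 's/^/e /' >> /tmp/yearly.graph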
When 'fbackup' begins backing up a file (any file) it "notes" the file's modification timestamp. Once the file has been fully transferred to tape, a comparison is made to determine if the timestamp has in fact changed. If it has, then the file (on tape) is marked as "bad" and the copy is attempted again.
'fbackup' will retry 'maxretries' times before skipping the file. The 'maxretries' value is defined in the 'config' file (see "man 1M fbackup"). The default is five (5). Each retry will be logged. Redirecting the 'v'erbose dialog of 'fbackup' to a file is a great way of seeing this and other messages.
The 'fbackup' configuration parameters are enumerated in the man pages for 'fbackup'. A default set is used if the 'config' file is absent, but these are non-optimal.
I don't have any magic values for you to use. I like blocksperrecord=64 or 128 with a checkpointfreq=128 or 256 accordingly. The 'blocksperrecord' parameter has generally been shown to have the biggest impact with the 'checkpointfreq' also being important. I'd rather gain speed during backup and sacrifice it during recovery, so I don't worry too much about the parameters like 'filesperfsm'. Three to five retries for busy files is more than enough.
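Putting the pieces together, a config file (call it /etc/fbackup.config; the name and values are only examples, taken from the figures discussed above, with the other keywords from the man page left at their defaults) would contain lines such as:
blocksperrecord 128
checkpointfreq 256
maxretries 3
and the backup itself, with the verbose dialog captured to a file, might be run as:
fbackup -0v -c /etc/fbackup.config -g /tmp/yearly.graph -f /dev/rmt/0m > /var/adm/fbackup.log 2>&1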
An excellent document is #B00000196 "Understanding Fbackup/Frecover and Optimizing Backup Performance".
Regards!
...JRF...
01-09-2002 04:17 PM
Re: yearly backups - revised
The Unix classics are very old in computer years and should be used only to handle occasional data interchange. These tools were designed when 100 megs was a huge disk drive, one that could not even be lifted by one person. Using the same tools more than 20 years later to back up 5-6 orders of magnitude more data doesn't sound like a wise choice.
Be sure to use a config file, as the default values for fbackup are designed for reel-to-reel tapes; you'll need to adjust them for DDS and DLT drives.
Bill Hassell, sysadmin