Hundreds of thousands of small (5-10k) files - improve backup time
10-29-2004 02:19 AM
We are using HP OpenView Storage Data Protector version A.05.10 on an HP-UX server.
Can anyone suggest how they back up similar systems, to help us improve our backup times?
10-29-2004 02:23 AM
Re: Hundreds of thousands of small (5-10k) files - improve backup time
How about this: on a daily basis, tar and compress all the files for that day into one archive. That way you've only got one file per day to back up. A sketch of the idea follows.
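For instance (an untested sketch; the paths, the pax usage, and the nightly schedule are all just placeholders to illustrate the idea):

#!/usr/bin/sh
# Nightly job: bundle everything modified in the last 24 hours into one archive.
SRC=/data/incoming                        # placeholder for the real directory
ARCH=/data/archive/files-$(date +%Y%m%d).tar

cd $SRC || exit 1
# pax reads the file list from stdin, which avoids command-line length limits
find . -type f -mtime -1 | pax -w -f $ARCH
compress $ARCH                            # leaves files-YYYYMMDD.tar.Z

Then point Data Protector at /data/archive instead of at the individual files.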
10-29-2004 02:27 AM
Re: Hundreds of thousands of small (5-10k) files - improve backup time
Files which have not been used or modified for n days can just be kept archived, using tar, in a directory called TAR or something of your choice. Back those up separately, and in the original directory just back up the new files.
find . -mtime +n -type ...........
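Filled out, that might look something like this (untested; the 30-day cutoff, the directory names, and the deletion step are only one way to realize the idea):

#!/usr/bin/sh
# Sweep files untouched for DAYS days into an archive under ./TAR.
DIR=/data          # placeholder for the real directory
DAYS=30            # the "n" above; pick your own cutoff
ARCH=TAR/old-$(date +%Y%m%d).tar

cd $DIR || exit 1
mkdir -p TAR
# Build the list once so the archive and the cleanup see the same files
find . -name TAR -prune -o -type f -mtime +$DAYS -print > /tmp/old.$$
pax -w -f $ARCH < /tmp/old.$$ &&
    xargs rm -f < /tmp/old.$$    # remove originals only if archiving worked
rm -f /tmp/old.$$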
Hope this helps!
Thanks
Prashant
10-29-2004 04:39 AM
Re: Hundreds of thousands of small (5-10k) files - improve backup time
The best approach is to literally not care how long the backups take. One way is to use vxfs snapof mounts to create snapshots of each filesystem; this takes only seconds per filesystem. Normal operations can then resume, and you back up the snapshots at your convenience, so speed is no longer much of a concern. When the backup is finished, you unmount the snapshots.
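On HP-UX with OnlineJFS, that is roughly the following (the volume and mount-point names here are hypothetical):

# Mount a point-in-time snapshot of /data, backed by a spare logical volume.
# The snapshot LV only needs room for blocks that change during the backup,
# typically 10-15% of the filesystem.
mkdir -p /snap
mount -F vxfs -o snapof=/data /dev/vg01/snaplv /snap
# ... run the Data Protector filesystem backup against /snap ...
umount /snap    # unmounting discards the snapshot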
You can improve your backups somewhat by changing from "Log All" to "Log Directory"; this reduces the number of DP database updates at the expense of more tedious restores.
10-29-2004 04:44 AM
Re: Hundreds of thousands of small (5-10k) files - improve backup time
If you can unmount the filesystem where the files reside for some time, you can back up the raw volume to improve backup performance.
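A bare-bones sketch of that (all device names hypothetical; note that a raw image restores all-or-nothing: you get the whole volume back, not individual files):

umount /data
dd if=/dev/vg01/rlvol1 of=/dev/rmt/0m bs=64k    # raw LV straight to tape
mount /data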
Regards,
Zygmunt
10-29-2004 06:10 AM
Re: Hundreds of thousands of small (5-10k) files - improve backup time
The only real way you can improve the performance for this filesystem is to improve the filesystem performance. Change to a faster FS and get off RAID 5, if you're using it at all. Otherwise, just look at how long it takes to do an ls -R on this filesystem: if you have directories with loads of entries, it will take the backup software just as long to read those entries and check whether each one needs to be backed up (in the case of an incremental). On top of that, all those tiny files will leave your drive shoeshining rather than streaming, so you've got several things ganging up to keep this backup from performing.
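For a quick gauge of how much of the window is pure tree-walking (the mount point is a placeholder):

time ls -R /data > /dev/null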
Snapshot is the best idea to let you get individual files back. Depending on what kind of transactions are taking place here, you may want to take a snapshot every hour so files can be restored immediately from disk, then delete those snapshots after your nightly backup. That will probably save you from going to tape except in the event of either a real disaster or the "I don't know when I last saw the file" user...
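A minimal single-snapshot rotation sketch of that (all names hypothetical; these snapshots live only while mounted, so keeping several hours' worth would need one spare LV per snapshot):

# crontab entry: replace the snapshot at the top of every hour
0 * * * * /usr/sbin/umount /snap >/dev/null 2>&1; /usr/sbin/mount -F vxfs -o snapof=/data /dev/vg01/snaplv /snap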