02-16-2003 07:20 PM
Needed: Large and Fast system backup
I need to back up a Linux system daily. There are many files and directories (about 2 GB total), and creating a tar.gz file from all of them every day takes too much CPU.
I am thinking of this:
1) Create a tar.gz from all the files to be backed up.
Then run a daily program that will:
1) Search for new or modified files; these will overwrite or be appended to the tar.gz file.
2) Search for deleted files; these will be removed from the tar.gz file.
Any ideas on how to implement 1) and 2)?
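Something like this is roughly what I have in mind for 1), as a rough, untested sketch (/data and /backup are just placeholder paths):

#!/bin/sh
# Untested sketch. Run "touch /backup/last-run" once before the first run.
SOURCE=/data
BACKUP=/backup

# 1) Files new or modified since the last run, judged against a marker file,
#    go into a small dated incremental archive instead of rewriting everything.
find "$SOURCE" -type f -newer "$BACKUP/last-run" -print0 \
  | tar --create --gzip --null --files-from=- \
        --file="$BACKUP/incr-$(date +%Y%m%d).tar.gz"
touch "$BACKUP/last-run"

# 2) is the part I am stuck on: tar --delete only works on uncompressed
#    archives, so I see no way to drop deleted files from the tar.gz in place.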
Thanks
02-16-2003 08:45 PM
Re: Needed: Large and Fast system backup
Tar itself cannot remove entries for files that have been deleted, so you would have to write your own routine that asks "is this file still there?" for every entry, which would mean even worse CPU overhead.
For that it's probably better to use something like 'rdist' or 'rsync' (which are traditionally used for remote-system backup of a given file-set).
As you don't seem concerned about keeping the 'backups' on the same system, one of these would be ideal. You could then tar/compress the backup structure for a decent snapshot.
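For example (a rough sketch, with made-up paths):

#!/bin/sh
SOURCE=/data
MIRROR=/backup/mirror

# Keep a plain-file mirror in sync; --delete removes files from the mirror
# that no longer exist at the source, so there is no per-file "is it still
# there?" check to pay for.
rsync -a --delete "$SOURCE/" "$MIRROR/"

# When you want a compressed snapshot, tar up the mirror.
tar -czf "/backup/snapshot-$(date +%Y%m%d).tar.gz" -C "$MIRROR" .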
02-16-2003 10:38 PM
Re: Needed: Large and Fast system backup
I guess this is a crude way of taking a backup and would suggest using rsync instead.
Check out the link below for more info.
http://samba.org/rsync/index.html
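For instance, a nightly cron entry could look something like this (hostname and paths are made up):

# Hypothetical crontab line: mirror /data to a backup host at 02:30 every day.
# -a preserves permissions and timestamps, -z compresses in transit, and
# --delete removes files on the backup side that were deleted locally.
30 2 * * * rsync -az --delete /data/ backuphost:/backup/data/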
-balaji
02-16-2003 11:30 PM
Re: Needed: Large and Fast system backup
02-17-2003 12:16 AM
Re: Needed: Large and Fast system backup
-balaji