<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Needed: Large and Fast system backup in Operating System - Linux</title>
    <link>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905839#M78236</link>
    <description>If creating a tar/gz file is too much CPU overhead, then you've got a seriously under-powered server there.  Given that I can do this on my old PPro's without making much of a dent in usability, I'd like to know what you're doing to this poor machine!&lt;BR /&gt;&lt;BR /&gt;Tar itself cannot remove entries for files that have been deleted, so you'd have to write your own routines that ask "is this file still there?", which would be even worse CPU overhead.&lt;BR /&gt;&lt;BR /&gt;For that, it's probably better to use something like 'rdist' or 'rsync' (which are traditionally used for remote-system backup of a given file-set).&lt;BR /&gt;&lt;BR /&gt;As you don't seem to be concerned about creating 'backups' on the same system, one of these would be ideal.  You could then tar/compress the backup structure for a decent snapshot.</description>
    <pubDate>Mon, 17 Feb 2003 04:45:48 GMT</pubDate>
    <dc:creator>Stuart Browne</dc:creator>
    <dc:date>2003-02-17T04:45:48Z</dc:date>
    <item>
      <title>Needed: Large and Fast system backup</title>
      <link>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905838#M78235</link>
      <description>Hi&lt;BR /&gt;I need to back up a Linux system daily. There are many files and directories (about 2G total), and creating a tar.gz file from them every day will take up too much CPU.&lt;BR /&gt;&lt;BR /&gt;I am thinking of this:&lt;BR /&gt;1) Create a tar.gz from all the files to be backed up&lt;BR /&gt;&lt;BR /&gt;Then run a daily program that:&lt;BR /&gt;1) Searches for new or modified files. These files will overwrite or be appended to the tar.gz file&lt;BR /&gt;2) Searches for deleted files. These files will be deleted from the tar.gz file.&lt;BR /&gt;&lt;BR /&gt;Any ideas on how to implement 1) and 2)?&lt;BR /&gt;&lt;BR /&gt;Thanks</description>
      <pubDate>Mon, 17 Feb 2003 03:20:01 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905838#M78235</guid>
      <dc:creator>kenny chia</dc:creator>
      <dc:date>2003-02-17T03:20:01Z</dc:date>
    </item>
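    <!-- The incremental behaviour kenny asks for in 1) and 2) is close to what GNU tar's --listed-incremental mode already provides; a minimal sketch follows. The /data and /backup paths are placeholders, not from the thread. -->

```shell
#!/bin/sh
# Sketch of an incremental backup with GNU tar's --listed-incremental.
# The snapshot file records file metadata, so each later run archives
# only new or changed files; deletions are recorded as well, so
# restoring the full archive plus the incrementals reproduces the tree.
# SRC and DEST are placeholder paths.

SRC=/data
DEST=/backup
SNAP="$DEST/backup.snar"

# Level-0 (full) backup: run once, creates the snapshot file.
tar --listed-incremental="$SNAP" -czf "$DEST/full.tar.gz" -C "$SRC" .

# Daily run: only new or modified files get compressed.
tar --listed-incremental="$SNAP" -czf "$DEST/inc-$(date +%F).tar.gz" -C "$SRC" .
```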
    <item>
      <title>Re: Needed: Large and Fast system backup</title>
      <link>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905839#M78236</link>
      <description>If creating a tar/gz file is too much CPU overhead, then you've got a seriously under-powered server there.  Given that I can do this on my old PPro's without making much of a dent in usability, I'd like to know what you're doing to this poor machine!&lt;BR /&gt;&lt;BR /&gt;Tar itself cannot remove entries for files that have been deleted, so you'd have to write your own routines that ask "is this file still there?", which would be even worse CPU overhead.&lt;BR /&gt;&lt;BR /&gt;For that, it's probably better to use something like 'rdist' or 'rsync' (which are traditionally used for remote-system backup of a given file-set).&lt;BR /&gt;&lt;BR /&gt;As you don't seem to be concerned about creating 'backups' on the same system, one of these would be ideal.  You could then tar/compress the backup structure for a decent snapshot.</description>
      <pubDate>Mon, 17 Feb 2003 04:45:48 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905839#M78236</guid>
      <dc:creator>Stuart Browne</dc:creator>
      <dc:date>2003-02-17T04:45:48Z</dc:date>
    </item>
    <item>
      <title>Re: Needed: Large and Fast system backup</title>
      <link>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905840#M78237</link>
      <description>This could be done by maintaining a list of files with their md5 sums: each run, generate a fresh list of all files with their md5 sums, and if an entry differs between the two lists, the file has been modified and should be backed up.&lt;BR /&gt;&lt;BR /&gt;I guess this is a crude way of taking a backup, though, and would suggest using rsync instead.&lt;BR /&gt;Check out the link for more info:&lt;BR /&gt;&lt;A href="http://samba.org/rsync/index.html" target="_blank"&gt;http://samba.org/rsync/index.html&lt;/A&gt;&lt;BR /&gt;-balaji</description>
      <pubDate>Mon, 17 Feb 2003 06:38:27 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905840#M78237</guid>
      <dc:creator>Balaji N</dc:creator>
      <dc:date>2003-02-17T06:38:27Z</dc:date>
    </item>
    <item>
      <title>Re: Needed: Large and Fast system backup</title>
      <link>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905841#M78238</link>
      <description>rsync looks like a good idea, but it doesn't seem to work on tar.gz files. Anyway, if I can get a large hard drive, I might implement it without tar.gz.</description>
      <pubDate>Mon, 17 Feb 2003 07:30:01 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905841#M78238</guid>
      <dc:creator>kenny chia</dc:creator>
      <dc:date>2003-02-17T07:30:01Z</dc:date>
    </item>
    <item>
      <title>Re: Needed: Large and Fast system backup</title>
      <link>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905842#M78239</link>
      <description>Yes, that would be the best way to go: rsync the files directly onto a new drive instead of the .tar.gz approach.&lt;BR /&gt;-balaji</description>
      <pubDate>Mon, 17 Feb 2003 08:16:24 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/needed-large-and-fast-system-backup/m-p/2905842#M78239</guid>
      <dc:creator>Balaji N</dc:creator>
      <dc:date>2003-02-17T08:16:24Z</dc:date>
    </item>
  </channel>
</rss>

