<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Urgent !!! restore data failed!!! in Operating System - HP-UX</title>
    <link>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712490#M946946</link>
    <description>Hi, &lt;BR /&gt;I want to create a tar archive of about 6 GB of data in the directory ./work, so I ran "tar cvf work.tar ./work". &lt;BR /&gt;Everything seemed fine at first. Because my filesystem only supports files up to 2 GB, &lt;BR /&gt;when work.tar reached 2 GB tar asked me to add a new device, so I touched a new empty file named &lt;BR /&gt;work1.tar and tar went on writing the data into work1.tar. This way I ended up with three files &lt;BR /&gt;named work.tar, work1.tar and work2.tar.&lt;BR /&gt;When I ran "tar xvf work.tar" to extract, the first 2 GB restored fine, but at the end&lt;BR /&gt; of the untar an error occurred:&lt;BR /&gt;&lt;BR /&gt;Tar: error! blocksize changed&lt;BR /&gt;&lt;BR /&gt;and the untar stopped.&lt;BR /&gt;So I can only extract the first 2 GB of files. I also tried another way:&lt;BR /&gt;I upgraded the filesystem to support files larger than 2 GB, then used cat to join work.tar, work1.tar and&lt;BR /&gt;work2.tar into one file, work_all.tar. Extracting from work_all.tar produced no error, &lt;BR /&gt;but I still could only extract the first 2 GB of data.</description>
    <pubDate>Sat, 27 Apr 2002 12:12:17 GMT</pubDate>
    <dc:creator>Ocean Lee</dc:creator>
    <dc:date>2002-04-27T12:12:17Z</dc:date>
    <item>
      <title>Urgent !!! restore data failed!!!</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712490#M946946</link>
      <description>Hi, &lt;BR /&gt;I want to create a tar archive of about 6 GB of data in the directory ./work, so I ran "tar cvf work.tar ./work". &lt;BR /&gt;Everything seemed fine at first. Because my filesystem only supports files up to 2 GB, &lt;BR /&gt;when work.tar reached 2 GB tar asked me to add a new device, so I touched a new empty file named &lt;BR /&gt;work1.tar and tar went on writing the data into work1.tar. This way I ended up with three files &lt;BR /&gt;named work.tar, work1.tar and work2.tar.&lt;BR /&gt;When I ran "tar xvf work.tar" to extract, the first 2 GB restored fine, but at the end&lt;BR /&gt; of the untar an error occurred:&lt;BR /&gt;&lt;BR /&gt;Tar: error! blocksize changed&lt;BR /&gt;&lt;BR /&gt;and the untar stopped.&lt;BR /&gt;So I can only extract the first 2 GB of files. I also tried another way:&lt;BR /&gt;I upgraded the filesystem to support files larger than 2 GB, then used cat to join work.tar, work1.tar and&lt;BR /&gt;work2.tar into one file, work_all.tar. Extracting from work_all.tar produced no error, &lt;BR /&gt;but I still could only extract the first 2 GB of data.</description>
      <pubDate>Sat, 27 Apr 2002 12:12:17 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712490#M946946</guid>
      <dc:creator>Ocean Lee</dc:creator>
      <dc:date>2002-04-27T12:12:17Z</dc:date>
    </item>
    <item>
      <title>Re: Urgent !!! restore data failed!!!</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712491#M946947</link>
      <description>Hi,&lt;BR /&gt;&lt;BR /&gt;The stock tar only supports files up to 2 GB. If you want a tar that handles files larger than 2 GB, you can download one from the HP-UX porting site &lt;A href="http://hpux.ee.ualberta.ca/" target="_blank"&gt;http://hpux.ee.ualberta.ca/&lt;/A&gt;&lt;BR /&gt;and install it on your server.&lt;BR /&gt;&lt;BR /&gt;CTK</description>
      <pubDate>Sat, 27 Apr 2002 12:34:06 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712491#M946947</guid>
      <dc:creator>Vijeesh CTK</dc:creator>
      <dc:date>2002-04-27T12:34:06Z</dc:date>
    </item>
    <item>
      <title>Re: Urgent !!! restore data failed!!!</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712492#M946948</link>
      <description>Like CTK said, get the GNU tar ported for HP-UX:&lt;BR /&gt;&lt;BR /&gt;&lt;A href="http://hpux.cs.utah.edu/hppd/hpux/Gnu/tar-1.13.25/" target="_blank"&gt;http://hpux.cs.utah.edu/hppd/hpux/Gnu/tar-1.13.25/&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;Also add the latest gzip:&lt;BR /&gt;&lt;BR /&gt;&lt;A href="http://hpux.cs.utah.edu/hppd/hpux/Gnu/gzip-1.3.3/" target="_blank"&gt;http://hpux.cs.utah.edu/hppd/hpux/Gnu/gzip-1.3.3/&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;and look for other useful utilities like "lsof".&lt;BR /&gt;&lt;BR /&gt;live free or die&lt;BR /&gt;harry</description>
      <pubDate>Sat, 27 Apr 2002 13:10:21 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712492#M946948</guid>
      <dc:creator>harry d brown jr</dc:creator>
      <dc:date>2002-04-27T13:10:21Z</dc:date>
    </item>
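    <!--
    The symptom in the original post (cat-joining the volumes and then losing everything past the first 2 GB with no error) matches classic tar stopping at the first end-of-archive record it meets. A minimal sketch, assuming the GNU tar recommended above is installed and on the PATH as plain "tar"; the file and directory names here are illustrative, not from the thread. GNU tar's -i (ignore-zeros) option reads past the zeroed blocks between concatenated archives:

```shell
set -e
mkdir -p demo/src demo/out
echo one > demo/src/a.txt
echo two > demo/src/b.txt

# Two separate archives, then joined with cat, as in the original post.
tar -C demo/src -cf demo/part1.tar a.txt
tar -C demo/src -cf demo/part2.tar b.txt
cat demo/part1.tar demo/part2.tar > demo/all.tar

# A plain "tar -xf" stops at the first end-of-archive record and restores
# only a.txt; GNU tar's -i (ignore-zeros) skips those zero blocks and
# keeps reading into the next concatenated archive.
tar -C demo/out -i -xf demo/all.tar
ls demo/out
```

    Note this only helps when each volume was written as a complete archive; it is a sketch of the ignore-zeros technique, not a guarantee for archives produced by the HP-UX tar multi-volume prompt.
    -->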
    <item>
      <title>Re: Urgent !!! restore data failed!!!</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712493#M946949</link>
      <description>Your problem may be your shell. Piping either one large file or several 2 GB files through plain tar has always worked for me when using the plain old /usr/bin/ksh.</description>
      <pubDate>Mon, 29 Apr 2002 07:49:53 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712493#M946949</guid>
      <dc:creator>Dragan Krnic_2</dc:creator>
      <dc:date>2002-04-29T07:49:53Z</dc:date>
    </item>
    <item>
      <title>Re: Urgent !!! restore data failed!!!</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712494#M946950</link>
      <description>Hi there.&lt;BR /&gt;The piping idea seems the way to go.&lt;BR /&gt;We do this during exports of Oracle databases by piping the export data into a .dmp file through a named pipe created with mkfifo. If you need more details, let us know.&lt;BR /&gt;Rgds&lt;BR /&gt;Alexander M. Ermes</description>
      <pubDate>Mon, 29 Apr 2002 08:07:19 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/urgent-restore-data-failed/m-p/2712494#M946950</guid>
      <dc:creator>Alexander M. Ermes</dc:creator>
      <dc:date>2002-04-29T08:07:19Z</dc:date>
    </item>
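    <!--
    The named-pipe idea in the last two replies can be sketched as follows. This is an illustration of the general mkfifo/split technique, not the posters' exact Oracle procedure; all names (pdemo, piece., out2) are made up, and it assumes a tar and split with GNU-style options. tar streams the archive into a FIFO, split chops the stream into pieces that fit under a per-file size limit, and cat feeds the pieces back to tar on stdin for the restore, so no single file ever exceeds the limit:

```shell
set -e
mkdir -p pdemo/work out2
echo hello > pdemo/work/f.txt

# Writer: tar streams the archive into the named pipe in the background.
mkfifo pdemo/tarpipe
tar -C pdemo -cf pdemo/tarpipe work &

# Reader: split chops the stream into fixed-size pieces (512 KB here;
# something under 2 GB, e.g. 1024m, in the original scenario).
split -b 512k pdemo/tarpipe pdemo/piece.
wait

# Restore by feeding the pieces back, in order, to tar on stdin.
cat pdemo/piece.* | tar -C out2 -xf -
```

    Because tar sees one continuous stream on both the write and the read side, there are no per-volume end-of-archive records to trip over, which is why the piping approach sidesteps the "blocksize changed" problem entirely.
    -->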
  </channel>
</rss>

