<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic compressing large files in Operating System - HP-UX</title>
    <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279871#M180239</link>
    <description>HP-UX gurus,&lt;BR /&gt;I have a question: do you foresee, or have you experienced, any problems with using large files (especially Oracle data files) of 60 GB or more?&lt;BR /&gt;&lt;BR /&gt;How much free space do we need to compress a 60 GB file so that the compression completes without running out of disk space?&lt;BR /&gt;I appreciate your response.&lt;BR /&gt;&lt;BR /&gt;Thanks,&lt;BR /&gt;Mukundan</description>
    <pubDate>Tue, 18 May 2004 13:28:00 GMT</pubDate>
    <dc:creator>Mukundan_1</dc:creator>
    <dc:date>2004-05-18T13:28:00Z</dc:date>
    <item>
      <title>compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279871#M180239</link>
      <description>HP-UX gurus,&lt;BR /&gt;I have a question: do you foresee, or have you experienced, any problems with using large files (especially Oracle data files) of 60 GB or more?&lt;BR /&gt;&lt;BR /&gt;How much free space do we need to compress a 60 GB file so that the compression completes without running out of disk space?&lt;BR /&gt;I appreciate your response.&lt;BR /&gt;&lt;BR /&gt;Thanks,&lt;BR /&gt;Mukundan</description>
      <pubDate>Tue, 18 May 2004 13:28:00 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279871#M180239</guid>
      <dc:creator>Mukundan_1</dc:creator>
      <dc:date>2004-05-18T13:28:00Z</dc:date>
    </item>
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279872#M180240</link>
      <description>I have never used files that large, but compress needs free space equal to the file size: if the compressed file reaches that size, compress stops.&lt;BR /&gt;Using gzip should be better, but I don't know how much space it needs.&lt;BR /&gt;&lt;BR /&gt;Hope this helps,&lt;BR /&gt;Cesare</description>
      <pubDate>Tue, 18 May 2004 13:32:15 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279872#M180240</guid>
      <dc:creator>Cesare Salvioni</dc:creator>
      <dc:date>2004-05-18T13:32:15Z</dc:date>
    </item>
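    <!--
A minimal sketch of the pre-flight check described above: the original and the
compressed copy coexist until compression finishes, so require roughly the
file's own size in free space before starting. gzip stands in for HP-UX
compress so the sketch runs anywhere; the /tmp demo path is a hypothetical
stand-in for a real datafile.

```shell
#!/bin/sh
# Small demo file standing in for the 60 GB Oracle datafile.
FILE=/tmp/demo.dat
dd if=/dev/zero of="$FILE" bs=1024 count=64 2>/dev/null

# Worst case, the compressed copy is about as large as the original.
need_kb=$(du -k "$FILE" | awk '{print $1}')
# POSIX df -P keeps each filesystem on one line; field 4 is KB available.
free_kb=$(df -Pk "$(dirname "$FILE")" | awk 'NR==2 {print $4}')

if [ "$free_kb" -ge "$need_kb" ]; then
    gzip -f "$FILE"    # "compress -f" on stock HP-UX
    echo "compressed to ${FILE}.gz"
else
    echo "need ${need_kb} KB free, only ${free_kb} KB available" >&2
    exit 1
fi
```
    -->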
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279873#M180241</link>
      <description>Hi,&lt;BR /&gt;&lt;BR /&gt;During the compression of very large files, disk space is used as a buffer/build area, so as a general rule twice the file size is required.&lt;BR /&gt;&lt;BR /&gt;Paula&lt;BR /&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 18 May 2004 13:53:07 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279873#M180241</guid>
      <dc:creator>Paula J Frazer-Campbell</dc:creator>
      <dc:date>2004-05-18T13:53:07Z</dc:date>
    </item>
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279874#M180242</link>
      <description>That depends on how compressible the file is.&lt;BR /&gt;&lt;BR /&gt;During compress or gzip, "file.Z" or "file.gz" grows alongside the original, and "file" is removed only once the compression has finished.&lt;BR /&gt;&lt;BR /&gt;At a compression rate of 90% you would need 6 GB free.&lt;BR /&gt;At a compression rate of 50% you would need 30 GB.&lt;BR /&gt;And if the file is not compressible at all, you need the whole 60 GB free.&lt;BR /&gt;&lt;BR /&gt;To be safe, you should have at least the size of the original file free.&lt;BR /&gt;</description>
      <pubDate>Tue, 18 May 2004 14:03:26 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279874#M180242</guid>
      <dc:creator>Juergen Tappe</dc:creator>
      <dc:date>2004-05-18T14:03:26Z</dc:date>
    </item>
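    <!--
The figures above follow from: space needed = file size x (1 - compression
rate). A quick sketch of the arithmetic for the 60 GB case, in plain POSIX
shell:

```shell
#!/bin/sh
# Free space needed for the compressed output at various compression rates.
size_gb=60
for rate in 90 50 0; do
    need=$(( size_gb * (100 - rate) / 100 ))
    echo "compression rate ${rate}%: ${need} GB free needed"
done
```

This reproduces the 6 GB / 30 GB / 60 GB figures from the post.
    -->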
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279875#M180243</link>
      <description>Hey Mukundan,&lt;BR /&gt;&lt;BR /&gt;If you decide to use gzip, make sure you have the proper version, or it won't compress files bigger than 2 GB. The version shipped with HP-UX does not support large files; you have to get a later version from the HP Portal site to handle files of 2 GB or more...&lt;BR /&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 18 May 2004 14:15:44 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279875#M180243</guid>
      <dc:creator>Ryan B</dc:creator>
      <dc:date>2004-05-18T14:15:44Z</dc:date>
    </item>
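    <!--
A quick way to see which gzip is installed before trusting it with files past
the 2 GB mark (this only prints the version banner; whether a given build has
largefile support still has to be checked against its own release notes):

```shell
#!/bin/sh
# Print the installed gzip's version banner (first line only).
gzip -V 2>&1 | head -n 1
```
    -->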
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279876#M180244</link>
      <description>Hey Mukundan,&lt;BR /&gt;You can use 'compress' or 'gzip'. If you have problems with disk space on the drive where your large files are stored, find a filesystem with available free space, create an archive directory there, and then move the files over and compress them.&lt;BR /&gt;&lt;BR /&gt;You can set up a crontab entry to save off and compress the large files based on their age and then move them to the archive directory. The commands you would set up in the crontab would be something like:&lt;BR /&gt;find &lt;current location&gt; -mtime +1 -exec compress -f {} \;&lt;BR /&gt;mv &lt;current location&gt;/*.Z &lt;archive directory&gt;</description>
      <pubDate>Tue, 18 May 2004 14:27:03 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279876#M180244</guid>
      <dc:creator>Dani Seely</dc:creator>
      <dc:date>2004-05-18T14:27:03Z</dc:date>
    </item>
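    <!--
The crontab commands above, fleshed out as a runnable sketch. gzip stands in
for compress so it runs anywhere, and the /tmp paths plus the backdated demo
file are hypothetical stand-ins for the real current and archive locations:

```shell
#!/bin/sh
# Stand-ins for the current location and the archive directory.
SRC=/tmp/demo_src
ARCHIVE=/tmp/demo_archive
mkdir -p "$SRC" "$ARCHIVE"

# Demo file, backdated so that "-mtime +1" (older than one day) matches it.
printf 'old data\n' > "$SRC/old.log"
touch -t 202001010000 "$SRC/old.log"

# Compress files older than one day, then move the results to the archive.
find "$SRC" -type f -mtime +1 -exec gzip -f {} \;
find "$SRC" -name '*.gz' -exec mv {} "$ARCHIVE"/ \;
```
    -->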
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279877#M180245</link>
      <description>Hello Mukundan,&lt;BR /&gt;I assume this is your first experience on the ITRC forum as you did not award points to the forumers for the answers you were provided.  May I suggest that you take a look at the following link to learn about the points system in use here.  Thanks.&lt;BR /&gt;&lt;BR /&gt;&lt;A href="http://forums1.itrc.hp.com/service/forums/helptips.do?#28" target="_blank"&gt;http://forums1.itrc.hp.com/service/forums/helptips.do?#28&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;Please read the article, assess the assistance you were provided by the forumers, then reward them.  Thanks!</description>
      <pubDate>Tue, 18 May 2004 22:48:20 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279877#M180245</guid>
      <dc:creator>Dani Seely</dc:creator>
      <dc:date>2004-05-18T22:48:20Z</dc:date>
    </item>
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279878#M180246</link>
      <description>It depends on what you are trying to compress.&lt;BR /&gt;When compress is invoked on a text file, you normally get 75-80% compression.&lt;BR /&gt;&lt;BR /&gt;So you can successfully compress a text file if you have free space of about 30% of the original file size (to be on the safe side) left in your filesystem.&lt;BR /&gt;&lt;BR /&gt;Regards,&lt;BR /&gt;&lt;BR /&gt;Kaps</description>
      <pubDate>Wed, 19 May 2004 01:14:58 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279878#M180246</guid>
      <dc:creator>KapilRaj</dc:creator>
      <dc:date>2004-05-19T01:14:58Z</dc:date>
    </item>
    <item>
      <title>Re: compressing large files</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279879#M180247</link>
      <description>Hello,&lt;BR /&gt;&lt;BR /&gt;60 GB is larger than the average dbf. On what basis did you pick this size? How large is your database? Ours is a 700 GB data warehouse, and the production standard is 5 GB per dbf. I don't know what the recommendation from Oracle is, though...&lt;BR /&gt;&lt;BR /&gt;Cheers,&lt;BR /&gt;&lt;BR /&gt;Nicolas</description>
      <pubDate>Wed, 19 May 2004 02:36:28 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/compressing-large-files/m-p/3279879#M180247</guid>
      <dc:creator>Nicolas Dumeige</dc:creator>
      <dc:date>2004-05-19T02:36:28Z</dc:date>
    </item>
  </channel>
</rss>

