<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Oracle imp of remote compressed dump file. in Operating System - Linux</title>
    <link>https://community.hpe.com/t5/operating-system-linux/oracle-imp-of-remote-compressed-dump-file/m-p/3699110#M21131</link>
    <description>Why don't you share it via NFS?&lt;BR /&gt;&lt;BR /&gt;Sometimes the uncompress command cannot handle files larger than 2 GB. Use gzip/bzip2 instead.</description>
    <pubDate>Wed, 28 Dec 2005 13:08:51 GMT</pubDate>
    <dc:creator>Ivan Ferreira</dc:creator>
    <dc:date>2005-12-28T13:08:51Z</dc:date>
    <item>
      <title>Oracle imp of remote compressed dump file.</title>
      <link>https://community.hpe.com/t5/operating-system-linux/oracle-imp-of-remote-compressed-dump-file/m-p/3699109#M21130</link>
      <description>Hi,&lt;BR /&gt;&lt;BR /&gt;Because there is no free space locally, I have to import from a compressed export dump file stored on a remote server.&lt;BR /&gt;&lt;BR /&gt;ServerA (export file stored here):&lt;BR /&gt;/export/prod9i/exp_prod.dmp.Z (file size 6 GB)&lt;BR /&gt;&lt;BR /&gt;ServerB (doing the import here with the following commands):&lt;BR /&gt;&lt;BR /&gt;$remsh ServerA "nohup uncompress &amp;lt; /export/prod9i/exp_prod.dmp.Z &amp;gt;" | dd "of=/home/ora9i/dbjob/refresh/imp_pipe.dmp" &amp;amp;&lt;BR /&gt;&lt;BR /&gt;$imp system/abc123@test fromuser=aq touser=aq rows=Y file=imp_pipe.dmp log=imp.log ignore=Y grants=N indexes=N \&lt;BR /&gt;constraints=N buffer=209715200 commit=Y recordlength=65535 &amp;amp;&lt;BR /&gt;&lt;BR /&gt;I have already configured the .rhosts file, and remsh is working fine.&lt;BR /&gt;&lt;BR /&gt;Is this on-the-fly uncompress command on the remote server correct?&lt;BR /&gt;&lt;BR /&gt;Thanks,&lt;BR /&gt;&lt;BR /&gt;Gulam.&lt;BR /&gt;</description>
      <pubDate>Wed, 28 Dec 2005 10:25:06 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/oracle-imp-of-remote-compressed-dump-file/m-p/3699109#M21130</guid>
      <dc:creator>Gulam Mohiuddin</dc:creator>
      <dc:date>2005-12-28T10:25:06Z</dc:date>
    </item>
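The pattern the question is reaching for is a decompressor writing into a named pipe (FIFO) while the reader consumes it concurrently; the FIFO must be created with mkfifo before either side opens it, and the input redirection belongs inside the remote command's quotes while the output redirection stays local. The following is a minimal local sketch of that pattern only, with gzip standing in for uncompress, cat standing in for imp, and illustrative /tmp paths; it is not the poster's actual remsh/imp invocation.

```shell
# Create a pretend compressed export dump (stand-in for exp_prod.dmp.Z).
printf 'dump data\n' | gzip -c > /tmp/exp_demo.dmp.gz

# Create the named pipe before any reader or writer opens it.
mkfifo /tmp/imp_pipe.dmp

# Writer: decompress into the pipe in the background.
# In the real setup this side would be something like:
#   remsh ServerA 'uncompress < /export/prod9i/exp_prod.dmp.Z' > imp_pipe.dmp &
gzip -dc /tmp/exp_demo.dmp.gz > /tmp/imp_pipe.dmp &

# Reader: consumes the pipe (stand-in for: imp ... file=/tmp/imp_pipe.dmp).
cat /tmp/imp_pipe.dmp

wait
rm /tmp/imp_pipe.dmp /tmp/exp_demo.dmp.gz
```

Note how this differs from the command in the question: there is no dangling `>` inside the remote quotes, the local redirection replaces the quoted dd invocation, and mkfifo guarantees the target is a FIFO rather than a growing regular file.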
    <item>
      <title>Re: Oracle imp of remote compressed dump file.</title>
      <link>https://community.hpe.com/t5/operating-system-linux/oracle-imp-of-remote-compressed-dump-file/m-p/3699110#M21131</link>
      <description>Why don't you share it via NFS?&lt;BR /&gt;&lt;BR /&gt;Sometimes the uncompress command cannot handle files larger than 2 GB. Use gzip/bzip2 instead.</description>
      <pubDate>Wed, 28 Dec 2005 13:08:51 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/oracle-imp-of-remote-compressed-dump-file/m-p/3699110#M21131</guid>
      <dc:creator>Ivan Ferreira</dc:creator>
      <dc:date>2005-12-28T13:08:51Z</dc:date>
    </item>
    <item>
      <title>Re: Oracle imp of remote compressed dump file.</title>
      <link>https://community.hpe.com/t5/operating-system-linux/oracle-imp-of-remote-compressed-dump-file/m-p/3699111#M21132</link>
      <description>Shalom Gulam,&lt;BR /&gt;&lt;BR /&gt;It's not going to work.&lt;BR /&gt;&lt;BR /&gt;Oracle import does work nicely on NFS or Samba mounts.&lt;BR /&gt;&lt;BR /&gt;That's the way to go.&lt;BR /&gt;&lt;BR /&gt;SEP</description>
      <pubDate>Wed, 28 Dec 2005 14:44:37 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/oracle-imp-of-remote-compressed-dump-file/m-p/3699111#M21132</guid>
      <dc:creator>Steven E. Protter</dc:creator>
      <dc:date>2005-12-28T14:44:37Z</dc:date>
    </item>
  </channel>
</rss>

