Operating System - HP-UX

.gz file too large to uncompress

 
SOLVED
Jane F Lewicke
Advisor

.gz file too large to uncompress

I need to import an Oracle table from a large .gz export. When I run the unzip command, I get an error that the file is too large to uncompress.

So I tried to import directly from the .gz file and that failed as well.

Is there some way to get around this error?
11 REPLIES
Michael Tully
Honored Contributor

Re: .gz file too large to uncompress

Hi,

When you unzip or decompress a file, you need enough disk space available for both the compressed file and the uncompressed file, until the unzip has completed.

Suggest you either free up more space where you're doing it, or use a different file system to do the unzip.
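
To see up front how much space the uncompressed file needs, gzip can report the uncompressed size, and bdf shows what is free on the target file system. A minimal sketch; the dump file name and mount point below are only placeholders:

gzip -l exp.dmp.gz    # the "uncompressed" column is roughly the space you will need
bdf /oradata          # free space on the file system you plan to unzip into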

HTH
Michael
Anyone for a Mutiny ?
John Poff
Honored Contributor

Re: .gz file too large to uncompress

Hi,

One way around it might be to use the gzcat utility and pipe the output directly into the Oracle import command, assuming that the import command is happy working just with standard input and not with a file.

Something like this:

gzcat
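
If imp will not read from standard input, a named pipe does the same job. A minimal sketch, assuming the dump is called exp.dmp.gz and using a throwaway scott/tiger login and pipe path as placeholders:

mknod /tmp/imp_pipe p                                    # create a named pipe for imp to read from
gzcat exp.dmp.gz > /tmp/imp_pipe &                       # decompress into the pipe in the background
imp scott/tiger file=/tmp/imp_pipe full=y log=imp.log    # imp reads the uncompressed stream from the pipe
rm -f /tmp/imp_pipe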

JP
T G Manikandan
Honored Contributor

Re: .gz file too large to uncompress

Make sure you have enough space on the file system where you are trying to uncompress.
Is the file crossing the 2 GB limit?

If so, enable largefiles on that file system.
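
A rough sketch of checking and enabling it on a VxFS (JFS) file system; the device and mount point here are placeholders, so adjust them for your volume:

fsadm -F vxfs /dev/vg00/rlvol4          # reports "largefiles" or "nolargefiles"
fsadm -F vxfs -o largefiles /oradata    # enable large-file support on the mounted file system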
kish_1
Valued Contributor

Re: .gz file too large to uncompress

Please check whether the file system supports files larger than 2 GB with the following command:
fasadm /dev/vg00/rlvol4
Replace /dev/vg00/rlvol4 with the device for your file system.
share the power of the knowledge
kish_1
Valued Contributor

Re: .gz file too large to uncompress

Sorry, I gave you the wrong command.


fsadm /dev/vg00/rlvol4

share the power of the knowledge
Jochen Heuer
Respected Contributor

Re: .gz file too large to uncompress

Hi.

I don't think the HP-UX gzip is capable of handling large files.

Try GNU gzip (freeware).

Jochen
Well, yeah ... I suppose there's no point in getting greedy, is there?
Jochen Heuer
Respected Contributor

Re: .gz file too large to uncompress

By the way, you can find GNU gzip on the porting site:

http://hpux.asknet.de/hppd/hpux/Gnu/gzip-1.3.3/
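
Depots from the porting centre are normally installed with swinstall. Roughly like this, where the depot path and the product tag gzip are assumptions based on whatever file you download from the link above:

swinstall -s /tmp/gzip-1.3.3.depot gzip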
Well, yeah ... I suppose there's no point in getting greedy, is there?
Alexander M. Ermes
Honored Contributor

Re: .gz file too large to uncompress

Hi there.
We have had these problems as well.
Try to do this :

cat file.dmp.gz | gunzip > file.dmp


Your file system should be able to handle large files ( > 2 GB ).
Rgds
Alexander M. Ermes
.. and all these memories are going to vanish like tears in the rain! final words from Rutger Hauer in "Blade Runner"
Jane F Lewicke
Advisor

Re: .gz file too large to uncompress

That worked! Thank you!
Yogeeraj_1
Honored Contributor
Solution

Re: .gz file too large to uncompress

hi,

you can also try this to prevent the Oracle dump files from exceeding 2 GB; you end up with multiple smaller files which can easily be handled.

Use export to a PIPE and have compress and split read the pipe. The result is a couple of 500 MB compressed files that constitute the export. At 500 MB, any utility can deal with these files, and they can be moved around more easily.

Here is the CSH script I use, to show you how it is done. It does a full export and then tests the integrity of the export by doing a full import with show=y. That gives a file with all of my source code and DDL to boot.

#!/bin/csh -vx

setenv UID /
setenv FN exp.`date +%j_%Y`.dmp
setenv PIPE /tmp/exp_tmp_ora8i.dmp

setenv MAXSIZE 500m
setenv EXPORT_WHAT "full=y COMPRESS=n"

echo $FN

cd /nfs/atc-netapp1/expbkup_ora8i
ls -l

rm expbkup.log export.test exp.*.dmp* $PIPE
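# create the named pipe that exp writes to and gzip/split read from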
mknod $PIPE p

date > expbkup.log
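# compress the export stream coming out of the pipe and split it into $MAXSIZE chunks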
( gzip < $PIPE ) | split -b $MAXSIZE - $FN. &
#split -b $MAXSIZE $PIPE $FN. &

exp userid=$UID buffer=20000000 file=$PIPE $EXPORT_WHAT >>& expbkup.log
date >> expbkup.log


date > export.test
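# reassemble the chunks in order and decompress them back into the pipe for imp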
cat `echo $FN.* | sort` | gunzip > $PIPE &
#cat `echo $FN.* | sort` > $PIPE &
imp userid=sys/o8isgr8 file=$PIPE show=y full=y >>& export.test
date >> export.test

tail expbkup.log
tail export.test

ls -l
rm -f $PIPE

------------ eof -------------------------


Hope this helps too!

Best Regards
Yogeeraj
No person was ever honoured for what he received. Honour has been the reward for what he gave (Calvin Coolidge)
Jane F Lewicke
Advisor

Re: .gz file too large to uncompress

Hi,

That does help -- it is a lot easier when the usual commands work, and then others won't run into the same problem in a pinch.

Thank you,
Jane