
compressing large files

 
Mukundan_1
Occasional Contributor

compressing large files

HP-UX gurus,

I have a question: do you foresee, or have you experienced, any problems with using very large files (especially Oracle data files) of 60 GB or more?

And how much free space do we need to compress a 60 GB file so that the compression completes without running out of disk space?

I appreciate your responses.

Thanks,
Mukundan
8 REPLIES
Cesare Salvioni
Trusted Contributor

Re: compressing large files

I have never used files that large, but compress needs free space equal to the size of the file: the compressed output grows alongside the original, and in the worst case it can be nearly as big. If the output runs out of room, compress stops.
Using gzip should give better compression, but I don't know how much space it needs.
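
For example, a quick sanity check before compressing (the /data path and file name are just illustrations; bdf and du report sizes in KB):

bdf /data                       # free KB on the filesystem holding the file
du -ks /data/bigfile.dbf        # file size in KB - worst case you need this much free
compress -v /data/bigfile.dbf   # -v prints the percentage saved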

Hope this helps
Cesare
Paula J Frazer-Campbell
Honored Contributor

Re: compressing large files

Hi

While a very large file is being compressed, the growing output coexists on disk with the original until the compression finishes, so as a general rule allow for twice the file size in total.

Paula

If you can spell SysAdmin then you is one - anon
Juergen Tappe
Valued Contributor
Solution

Re: compressing large files

That depends on how compressible the file is.

During compress or gzip, "file.Z" or "file.gz" grows alongside the original, and "file" is removed only once the compression has finished.

With a compression rate of 90% you would need 6 GB free; with a rate of 50% you would need 30 GB; and if the file is not compressible at all, you need the whole 60 GB free.

To be safe, you should have at least as much free space as the size of the original file.
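
If you want an estimate up front, you can gauge the ratio on a sample before committing to the whole file (the path and sample size are only examples); the byte count printed, divided by the 100 MB sample, approximates the compressed fraction:

dd if=/data/bigfile.dbf bs=1024k count=100 2>/dev/null | compress | wc -c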
Working together
Ryan B
Frequent Advisor

Re: compressing large files

Hey Mukundan.

If you decide to use gzip, make sure you have the proper version, or it won't compress files bigger than 2 GB. The version shipped with HP-UX does not support them; you have to get a later version from the HP Portal site to handle files of 2 GB or larger...
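
A quick way to check which gzip you have (the paths are typical but may differ on your box):

which gzip       # HP's port commonly lives in /usr/contrib/bin
gzip --version   # newer builds handle files past 2 GB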

Dani Seely
Valued Contributor

Re: compressing large files

Hey Mukundan,
You can use 'compress' or 'gzip'. If disk space is tight on the filesystem where your large files live, find a filesystem with enough free space, create an archive directory there, move the files over, and compress them in the new location.

You can set up a crontab entry to compress the large files based on their age and then move them to the archive directory. The commands you would put in the crontab would be something like (the paths are examples):
find /data/export -mtime +1 -exec compress -f {} \;
mv /data/export/*.Z /archive
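
For instance, a single crontab entry along these lines would do both steps nightly at 02:00 (schedule and paths are again just examples):

0 2 * * * find /data/export -mtime +1 -exec compress -f {} \; && mv /data/export/*.Z /archive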
Together We Stand!
Dani Seely
Valued Contributor

Re: compressing large files

Hello Mukundan,
I assume this is your first experience on the ITRC forum as you did not award points to the forumers for the answers you were provided. May I suggest that you take a look at the following link to learn about the points system in use here. Thanks.

http://forums1.itrc.hp.com/service/forums/helptips.do?#28

Please read the article, assess the assistance you were provided by the forumers, then reward them. Thanks!
Together We Stand!
KapilRaj
Honored Contributor

Re: compressing large files

It depends on what you are trying to compress.
When compress is invoked on a text file, you normally get 75-80% compression.

So you can successfully compress a text file if you have about 30% (to be on the safe side) of the original file's size free in your filesystem.
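
As a rough check (the path is only an example), compare about 30% of the file's size against the free space bdf reports:

KB=`du -ks /data/big.log | awk '{print $1}'`    # file size in KB
echo "need roughly $(($KB * 30 / 100)) KB free" # ~30% of the original
bdf /data                                       # free KB on the filesystem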

Regds,

Kaps
Nothing is impossible
Nicolas Dumeige
Esteemed Contributor

Re: compressing large files

Hello,

60 GB is larger than the average dbf. On what basis did you pick this size? How large is your database? Ours is a 700 GB data warehouse, and our production standard is 5 GB per dbf. I don't know what the recommendation from Oracle is, though ...

Cheers

Nicolas
All different, all Unix