compressing large files
05-18-2004 06:28 AM
I have a question: do you foresee, or have you experienced, any problems with using large files (especially Oracle data files) of sizes 60 GB+?
How much free space do we need to compress a 60 GB file so that the compression completes without running out of disk space?
I appreciate your response.
Thanks,
Mukundan
05-18-2004 06:32 AM
Re: compressing large files
Using gzip should be better, but I don't know how much space it needs.
Hope this helps,
Cesare
05-18-2004 06:53 AM
Re: compressing large files
During the compression of very large files, disk space is used as a buffer/build area, so as a general rule twice the file size is required.
Paula
05-18-2004 07:03 AM
Solution
During compress or gzip, "file.Z" or "file.gz" grows, and as soon as the compression is finished, "file" is removed.
If you have a compression rate of 90%, you would need 6 GB.
At a compression rate of 50%, you would need 30 GB.
And if the file is not compressible at all, you need the whole 60 GB free.
To be safe, you should have at least as much free space as the size of the original file.
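A minimal way to check this on HP-UX before you start; the datafile path /oradata/big.dbf and the mount point are only examples:
# size of the file you want to compress
ls -l /oradata/big.dbf
# free space on the filesystem that will hold the compressed copy (bdf reports KB on HP-UX)
bdf /oradata
# gzip writes big.dbf.gz alongside the original and removes big.dbf only once it has finished
gzip /oradata/big.dbf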
05-18-2004 07:15 AM
Re: compressing large files
If you decide to use gzip, make sure you have the proper version, or it won't compress files bigger than 2 GB. The version shipped with HP-UX does not support this; you have to get a later version from the HP Portal site to handle files of 2 GB or larger...
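A quick way to see which gzip you are actually running; whether a given build handles files over 2 GB depends on how it was compiled, so treat this as a sanity check only:
# show the gzip binary found first in the PATH and the version it reports
which gzip
gzip --version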
05-18-2004 07:27 AM
Re: compressing large files
You can use 'compress' or 'gzip'. If you have problems with disk space on the drive where your large files are stored, you should find a filesystem with available free space and create an archive directory to move the files to and then compress them there.
You can set up a crontab entry to save off and compress the large files based on their age and then move them to the archive directory. The commands you would set up in the crontab would be built around find and mv, as in the sketch below.
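A rough sketch of that kind of job; the directories, file pattern, age threshold, and script name are all hypothetical and would need to be adapted:
# archive_dbf.sh (hypothetical): move datafiles older than 30 days to the
# archive filesystem, then compress them there
find /oradata -name '*.dbf' -mtime +30 -exec mv {} /archive \;
gzip /archive/*.dbf
# example crontab entry that runs the script at 01:00 every night:
# 0 1 * * * /usr/local/bin/archive_dbf.sh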
05-18-2004 03:48 PM
Re: compressing large files
I assume this is your first experience on the ITRC forum as you did not award points to the forumers for the answers you were provided. May I suggest that you take a look at the following link to learn about the points system in use here. Thanks.
http://forums1.itrc.hp.com/service/forums/helptips.do?#28
Please read the article, assess the assistance you were provided by the forumers, then reward them. Thanks!
05-18-2004 06:14 PM
Re: compressing large files
When compress is invoked on a text file, you normally get 75-80% compression.
So you can successfully compress a text file if you have about 30% (to be on the safe side) of the original file's size free in your filesystem.
Regards,
Kaps
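As a rough worked example of that estimate (the 75% figure is an assumption and may not hold for Oracle datafiles):
# expected compressed size of a 60 GB file at 75% compression
echo "60 * (1 - 0.75)" | bc -l    # -> 15.00 GB, so roughly 15-18 GB of free space should be enough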
05-18-2004 07:36 PM
Re: compressing large files
60 GB is larger than the average dbf. On what basis did you pick this size? How large is your database? Ours is a 700 GB data warehouse, and the production standard is a 5 GB dbf. I don't know what the recommendation from Oracle is, though...
Cheers
Nicolas