Operating System - OpenVMS

Wim Van den Wyngaert
Honored Contributor

Compression of very big files

We currently keep Sybase database dumps on disk in zip archives. The zip archive is about 20% of the size of the dumps.

Zip, however, consumes a lot of CPU (even with /LEVEL=1, about 60 seconds per 250 MB).

Does anyone have a solution to compress with (a lot) less CPU consumption?

Wim
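The level/CPU trade-off Wim describes can be sketched with Python's standard-library zlib, which uses the same deflate algorithm as Zip (the buffer below is illustrative stand-in data, not an OpenVMS Zip measurement):

```python
import time
import zlib

# Illustrative only: a repetitive buffer standing in for a database dump.
data = b"sybase dump page 0123456789 abcdef " * 100000  # ~3.5 MB

for level in (1, 5, 9):
    t0 = time.process_time()
    out = zlib.compress(data, level)
    cpu = time.process_time() - t0
    ratio = len(out) / len(data)
    print(f"level={level}: ratio={ratio:.4f}, cpu={cpu:.3f}s")

# Lower levels spend less CPU but leave the output somewhat larger;
# deflate level 1 corresponds roughly to Zip's /LEVEL=1.
```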
25 REPLIES
Ian Miller.
Honored Contributor

Re: Compression of very big files

Which version of zip?
____________________
Purely Personal Opinion
Wim Van den Wyngaert
Honored Contributor

Re: Compression of very big files

2.1
Wim
Karl Rohwedder
Honored Contributor

Re: Compression of very big files

ZIP V2.31 is current.

We ZIP Rdb backup files, and I found that BZIP2 compresses better and uses fewer resources (no hard data available at the moment).

regards Kalle
Wim Van den Wyngaert
Honored Contributor

Re: Compression of very big files

Correction 2.3.
Wim
Ian Miller.
Honored Contributor

Re: Compression of very big files

There is a beta version of a later Zip out there somewhere, or bzip2 is on the Freeware CD:
http://h71000.www7.hp.com/freeware/freeware70/000tools/alpha_images/bzip2.exe
____________________
Purely Personal Opinion
Karl Rohwedder
Honored Contributor

Re: Compression of very big files

Note that the BZIP2 on the Freeware CD is V1.0.1, whereas the version I use is 1.0.3a.
On the site http://antinode.org/dec/sw/bzip2.html
there is already a version 1.0.3b, but I haven't used that one.

regards Kalle
Wim Van den Wyngaert
Honored Contributor

Re: Compression of very big files

Just tried 1.0.2 on a variable-record dump of Sybase:

%RMS-F-IRC, illegal record encountered; VBN or record number = !UL
Wim
Wim Van den Wyngaert
Honored Contributor

Re: Compression of very big files

If I change the file to fixed,512 it works (bug or feature?). rfm=stm caused "record too large for user's buffer".

My reference file of 320 MB is compressed in 518 CPU seconds, while it takes 81 seconds with zip/level=1 (178 seconds without /level, which equals /level=5).

So, not what I was hoping for. Or is this a problem on my side?

Wim
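As a generic illustration of why converting the file to fixed,512 helps: a compressor only needs a byte stream, so feeding it fixed-size blocks sidesteps record semantics entirely. A minimal Python sketch using the standard-library bz2 incremental API (the 512-byte block size mirrors the fixed,512 layout; the data is hypothetical):

```python
import bz2

def compress_in_blocks(data: bytes, block: int = 512) -> bytes:
    """Feed a buffer to bz2 in fixed-size blocks, as if reading
    a fixed,512 file record by record."""
    comp = bz2.BZ2Compressor()
    out = bytearray()
    for i in range(0, len(data), block):
        out += comp.compress(data[i:i + block])
    out += comp.flush()  # emit whatever is still buffered
    return bytes(out)

# A payload of 2000 fixed-length, zero-padded 512-byte records.
payload = b"fixed length record padded to size".ljust(512, b"\0") * 2000
packed = compress_in_blocks(payload)
assert bz2.decompress(packed) == payload
```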
Karl Rohwedder
Honored Contributor

Re: Compression of very big files

Just gave it a try (using a small Rdb backup file). BZIP2 uses considerably more CPU but less I/O, and produces a far better result in terms of file size (see attached text file).
I used the 1.0.3b version.

regards Kalle
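Karl's observation (more CPU, smaller output) is easy to reproduce in miniature with Python's standard-library zlib and bz2 modules, though the exact numbers depend entirely on the data; the buffer below is illustrative only:

```python
import bz2
import time
import zlib

# Structured, repetitive text, loosely resembling a database export.
data = b"".join(b"row %06d | name_%d | some filler text\n" % (i, i % 97)
                for i in range(100000))

for name, fn in (("zlib-9", lambda d: zlib.compress(d, 9)),
                 ("bz2-9", lambda d: bz2.compress(d, 9))):
    t0 = time.process_time()
    out = fn(data)
    cpu = time.process_time() - t0
    print(f"{name}: {len(out)} bytes in {cpu:.2f} CPU s")

# On data like this, bz2 tends to produce the smaller file while
# burning noticeably more CPU, matching the thread's experience.
```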