
Bruno Cunha
Frequent Advisor

Re: Compressing files

Hello, you can compress all the files except the last one; that way, if Oracle is still writing to a file, you don't compress it. Alternatively, you can check the size of each file before compression: if the file is at the maximum size (200 MB in your case), then Oracle has finished writing to it.
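
For the first approach, here is a minimal sketch, assuming the archive logs live in /oracle/arch and are named arch*.arc (both the directory and the pattern are assumptions):

cd /oracle/arch || exit 1
# the most recently modified file may still be being written by Oracle
newest=$(ls -t arch*.arc 2>/dev/null | head -1)
for f in arch*.arc
do
    [ -f "$f" ] || continue              # nothing matched the pattern
    [ "$f" = "$newest" ] && continue     # leave the newest file alone
    compress "$f"
done

The size check could be done the same way by testing the output of "ls -l" (or wc -c) against the 200 MB limit before calling compress.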

Bruno Cunha
Do it today, don't leave for tomorrow
Jean-Luc Oudart
Honored Contributor

Re: Compressing files

fuser -fu will give you the list of processes that have the file open (along with the owning user).
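
As a guard built on that, assuming fuser prints the PIDs of processes using the file on stdout (the file name itself goes to stderr), an empty result would mean nobody has the file open:

for f in arch*.arc
do
    pids=$(fuser -f "$f" 2>/dev/null)
    if [ -z "$pids" ]
    then
        compress "$f"      # no process is still writing to it
    fi
done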

In case of recovery, you would have to uncompress the file(s) as they are requested.
This might be a bit tricky if you don't have the space for it!

My opinion would be that you should have the space for the archive log files.
fiat lux
Bill Thorsteinson
Honored Contributor

Re: Compressing files

You could use find with a timestamp file to compress all uncompressed files older than the timestamp file. Something like:

touch compress.$$                 # record the current time in a scratch file

sleep 30                          # let any write in progress bump its mtime past the mark

# compress archive logs not modified after the mark; files still being written are skipped
find . -name arch\*.arc ! -newer compress.$$ -exec compress {} \;

rm compress.$$                    # clean up the scratch file

Aziz Zouagui
Frequent Advisor

Re: Compressing files


Well, I found a much safer solution; I tested it and it works great.

First, the cron job gets a list of all the files to compress; then I check each file against the V$ARCHIVED_LOG table in Oracle.

Once a file is properly archived and Oracle is done with it, the archiver process makes an entry in this table, which makes it safe to move or compress the file.

I have tested this by issuing an "alter system switch logfile"; right after that, I watched the directory as the file made its way to the archive destination, and at the same time I queried the V$ARCHIVED_LOG table for the existence of the file. The table never showed the file until Oracle finished archiving; then an entry popped up in the table.

So the cron job never touches a file that is not yet in the V$ARCHIVED_LOG table. If anyone is interested in the script, I will be glad to send it to them.
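
For anyone who wants the general idea before asking for the script, here is a rough sketch of that check (not the actual script; the directory, file pattern and "/ as sysdba" login are assumptions and will differ per site):

cd /oracle/arch || exit 1
dir=$(pwd)
for f in arch*.arc
do
    # ask Oracle whether this file is recorded as fully archived
    n=$(sqlplus -s "/ as sysdba" <<EOF | tr -d ' '
set heading off feedback off pagesize 0
select count(*) from v\$archived_log where name = '$dir/$f';
EOF
)
    if [ "${n:-0}" -ge 1 ] 2>/dev/null
    then
        compress "$f"      # safe: the archiver has logged it
    fi
done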

By the way, to answer the other question about why not simply provide enough space: this database generates close to 50 GB of archive logs a day, and the directory that holds them is only 40 GB. Compressing the files and sending them to a standby database on another server, and eventually to tape, is the best way to have space available for the next day.

Again, many thanks to everyone who contributed their time and intellect to help solve this problem.

Thank you.
Thomas Yake
Occasional Advisor

Re: Compressing files

Regardless of your process issues, you might want to replace the compress command with gzip:


itax_dev /tmp/algo # cp mail.log mail.log.preZ.txt
itax_dev /tmp/algo # cp mail.log mail.log.pregz.txt
itax_dev /tmp/algo # ls -arltp
total 7698
drwxrwxrwx 19 bin bin 6144 May 15 09:07 ../
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log.preZ.txt
drwxr-x--- 2 ty hpadmin 96 May 15 09:07 ./
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log.pregz.txt
itax_dev /tmp/algo # gzip mail.log.pregz.txt
itax_dev /tmp/algo # compress mail.log.preZ.txt
itax_dev /tmp/algo # lt
total 3168
drwxrwxrwx 19 bin bin 6144 May 15 09:07 ../
-r--r----- 1 ty hpadmin 1310919 May 15 09:07 mail.log
-r--r----- 1 ty hpadmin 203755 May 15 09:07 mail.log.preZ.txt.Z
-r--r----- 1 ty hpadmin 98570 May 15 09:07 mail.log.pregz.txt.gz
drwxr-x--- 2 ty hpadmin 1024 May 15 09:08 ./
itax_dev /tmp/algo
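
(The "lt" above is presumably a local alias for the earlier ls -arltp.) If CPU time matters, gzip also lets you trade speed for ratio: the default level is -6, -1 is fastest and -9 compresses hardest, e.g.:

gzip -9 mail.log      # best compression, more CPU time
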
Aziz Zouagui
Frequent Advisor

Re: Compressing files


Great info. You mean gzip compresses data much more than compress?

I didn't know that; I thought they were almost the same. But clearly from your example, gzip condensed the data more than compress did. I will definitely have to switch from compress to gzip.

Great input.