Operating System - HP-UX

phil cook
Frequent Advisor

Gzipping Oracle archive logs

Hi

I'm running an Oracle database on a machine where space is a major consideration. Intermittently, huge batch loads occur on this machine and have on occasion blown away the archive log filesystem. Currently I run a cron job (find) every fifteen minutes to gzip any archive logs that are more than a day old. What I'd like to do is gzip every file in that area apart from the one Oracle may currently be writing to. I could simply add a gzip * to cron, but I'm afraid of the consequences should this clash with a file being written. I'm sure there is a fairly simple solution, but I'm afraid I'm a bit slow. Can anybody suggest a good method?

Thanks

Phil
Do I have to?
10 REPLIES
Sandor Horvath_2
Valued Contributor
Solution

Re: Gzipping Oracle archive logs

Hi !

Try
gzip `ls -t | tail +2`
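As an illustration only, here is roughly how that one-liner might be wrapped into a small script for cron. The directory and the *ARC file pattern are only examples, so adjust them for your own system:

#!/usr/bin/sh
# Sketch: compress every archive log except the newest one, on the
# assumption that the newest file is the one Oracle may still be writing.
ARCHDIR=/live/oraarch                    # example path only

cd ${ARCHDIR} || exit 1

# ls -t sorts newest first; tail +2 drops the first (newest) entry.
FILES=`ls -t *ARC 2>/dev/null | tail +2`
[ -n "${FILES}" ] && gzip ${FILES}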

Regards, Saa
If there's no problem, don't fix it.
Alexander M. Ermes
Honored Contributor

Re: Gzipping Oracle archive logs

Hi there.
I attached a script that we have used for some time now.
I hope it helps.
Rgds
Alexander M. Ermes
.. and all these memories are going to vanish like tears in the rain! final words from Rutger Hauer in "Blade Runner"
Suhas_2
Regular Advisor

Re: Gzipping Oracle archive logs

Phil,
Here are my two cents...
1> Suppose you want to gzip everything under /ora1.
2> Use "find" with the "-newer" option to find those files. You can create the reference (template) file with the "touch -t" command; touch -t takes a timestamp followed by the filename.
3> Once find has picked out those files, gzipping them is the simplest part (see the sketch below).
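Just as a rough sketch of that approach (using the /ora1 directory from step 1; the file pattern, marker file and timestamp are only examples):

# Stamp a reference file with the cut-off time (MMDDhhmm; the value is only an example).
touch -t 01150800 /tmp/arch_cutoff
# Compress every archive log that is not newer than the reference file.
find /ora1 -type f -name '*ARC' ! -newer /tmp/arch_cutoff -exec gzip {} \;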

Hope this helps...
Suhas.
Never say "Die"
phil cook
Frequent Advisor

Re: Gzipping Oracle archive logs

Thanks to you all for your suggestions. I've used the first one (ls -t | tail, etc.) and it's working just fine.

Regards
Do I have to?
Dan Hetzel
Honored Contributor

Re: Gzipping Oracle archive logs

Hi Phil,

To avoid 'gzipping' the files already zipped, you may use:
gzip `ls -t !(*.gz) | tail +2`
where !(*.gz) means "all files not ending in .gz".

This works in the POSIX and Korn shells.
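If the extended !(...) pattern isn't available in your shell, one possible alternative (just a sketch, not from the post above) is to filter with grep instead:

gzip `ls -t | grep -v '\.gz$' | tail +2`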

Best regards,

Dan
Everybody knows at least one thing worth sharing -- mailto:dan.hetzel@wildcroft.com
phil cook
Frequent Advisor

Re: Gzipping Oracle archive logs

Cheers Dan - I am using the method in the form

gzip `ls -t /live/oraarch/*ARC | tail +2`

which is giving the same result I'd guess.

Yours does look a bit neater though.
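For anyone following along, the cron entry for the original fifteen-minute schedule could look something like the line below; the script path is only a placeholder for whatever wrapper holds the gzip one-liner:

0,15,30,45 * * * * /usr/local/bin/gzip_arch.sh >/dev/null 2>&1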

Thanks
Do I have to?
Alan Riggs
Honored Contributor

Re: Gzipping Oracle archive logs

Just FYI, your concern about zipping an active Oracle archive log might be unnecessary. Speak to your DBA, but a standard way of Oracle archiving involves a rotation of three archive logs; the archive is actually kept in Oracle's shared memory segment. As one log fills, data rolls over smoothly into the next, and the full log is written into the archive directory. So, if you look at the directory at any given time, you will see only files that are "full" (the exact size your DBA has configured for archive rollover). The "active" log is in the SGA.
Don Bentz
Regular Advisor

Re: Gzipping Oracle archive logs

FYI - my DBA tells me that even though the archived redo logs APPEAR full, the database has simply allocated a file according to its required size and may still be actively writing to it. The file size may not change from the original allocation. If you delete the "latest" file, you could run the risk of copying/gzipping/deleting an incomplete copy. I would welcome any authoritative information to the contrary.
Insecurity is our friend. It keeps you dependent.
Alan Riggs
Honored Contributor

Re: Gzipping Oracle archive logs

Well, I don't want to get into a DBA pissing match, so I will simply note that running wc against the newly created archive logs as they appear should shed light on the question in your environment. On my system, comparing the last two created logs yields:

963580 2309305 104858624 arch_36645.log
954027 2454387 104858624 arch_36646.log

This seems to support the idea of a "full" archive log being deposited in my environment. YMMV, of course.
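To run the same check on your own system, something along these lines would do (the directory and pattern are just the examples used earlier in the thread):

# Compare line/word/byte counts of the two most recently written archive logs.
ls -t /live/oraarch/*ARC | head -2 | xargs wc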
John Palmer
Honored Contributor

Re: Gzipping Oracle archive logs

There will be a finite amount of time during which the Oracle arch process copies the latest online log file to the archived log directory.

This will vary due to a number of factors, such as the logfile size, and you do need to cater for it in your script.

What I do in my script is check whether the log is being written, using the 'fuser' command. Something like:

LOGS=$(ls arch*.log 2>/dev/null)        # candidate archive logs, if any

for LOG in ${LOGS}
do
    # fuser reports any process that still has the file open,
    # so skip the log if Oracle is still writing to it.
    if [[ -n $(fuser ${LOG} 2>/dev/null) ]]
    then
        audit "Log ${LOG} is in use, ignoring..."   # audit: the script's own logging helper
        continue
    fi

    audit "compressing ${LOG}"
    /usr/contrib/bin/gzip ${LOG}
    # anything else you want to do to the log
done

I've just checked one of my servers: over the last three weeks, out of 523 logs compressed, only one was in use at the time the script ran (it runs every 5 minutes).

Regards,
John