
System Administration

 
SOLVED
MARIE ELDRIDGE
New Member

System Administration

Is there an easy way to locate and maintain system and application logs so you don't run out of disk space? I would like to run a cron job to perform this maintenance.

Constant learning keeps you on your toes and makes life interesting.
8 REPLIES
Andy Monks
Honored Contributor

Re: System Administration

SAM has an option to do this, but it only handles some of the logs.
Rick Garland
Honored Contributor

Re: System Administration

There are many tools out there for this purpose.
SAM is one good tool to keep on top of this. The porting archive also has a trimlog binary; you can configure what is to be done and all that good stuff.
You could also write a script to do the tasks. Be careful, though: some of the log files need to be handled with care. An example is syslog.log, which is an open file. You may want to cp it somewhere and then cat /dev/null > syslog.log, and you may want to kill -HUP `cat /var/run/syslog.pid` to have syslogd restart as well.
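
A rough sketch of that syslog handling, assuming the usual HP-UX location /var/adm/syslog/syslog.log (untested; the copy destination is just an example):

# keep a copy, then zero the live file in place so we never remove an open file
cp /var/adm/syslog/syslog.log /var/adm/syslog/OLDsyslog.log
cat /dev/null > /var/adm/syslog/syslog.log

# tell syslogd to restart/reopen its log
kill -HUP `cat /var/run/syslog.pid`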
CHRIS_ANORUO
Honored Contributor

Re: System Administration

Yes. For example, put this line in your cron file:
0 08 27 12 * cat /dev/null > /var/adm/wtmp
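
A variation on the same idea that keeps a dated copy before zeroing the file (the schedule and copy path are only examples; note that % must be escaped inside a crontab entry):

# 08:00 on the 27th of every month: save a dated copy, then truncate wtmp in place
0 08 27 * * cp /var/adm/wtmp /var/adm/wtmp.`date +\%m\%d\%y` && cat /dev/null > /var/adm/wtmp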

Cheers!


When We Seek To Discover The Best In Others, We Somehow Bring Out The Best In Ourselves.
James Odak
Valued Contributor
Solution

Re: System Administration

Assuming all your logs end in .log or .LOG, write a script like this and add it to your cron:

find / -depth -print | grep -i '\.log$' > /tmp/loglist

for X in `cat /tmp/loglist`
do
tail -50 $X > /tmp/templog
rm $X
mv /tmp/templog $X
done

This is the simplest way. You should add an if statement making sure /tmp/templog exists before removing and copying over the log file.

You can also add -size +2000 to the find; this will limit the truncating to just the large log files.

Within the for loop, add a line

echo $X >> somefile

to keep a log of which files were clipped.
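
A rough version of the script with those suggestions folded in (untested sketch; /yourappdir, the size threshold, and the list file are just examples):

# only look at regular files over ~1 MB (2000 512-byte blocks) whose names end in .log
find /yourappdir -type f -size +2000 -print | grep -i '\.log$' > /tmp/loglist

for X in `cat /tmp/loglist`
do
    tail -50 $X > /tmp/templog
    # only replace the log if the temp copy was actually written
    if [ -s /tmp/templog ]
    then
        rm $X
        mv /tmp/templog $X
        # keep a record of what was clipped
        echo $X >> /var/adm/clipped_logs
    fi
done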

You can also rename your log files each day instead of deleting them:

mv $X $X`date "+%m%d%y"`.log
Then, after the for loop, run another find command:
find / -depth -atime +x -exec rm {} \;

where x is the number of days old a file must be before it is deleted.
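
A sketch of that rename-and-expire variant (untested; /yourappdir and the 30-day cutoff are placeholders, and a -name filter is added so only the dated .log copies are removed):

for X in `cat /tmp/loglist`
do
    # keep a dated copy instead of deleting the contents
    mv $X $X`date "+%m%d%y"`.log
done

# then expire dated copies that have not been touched in 30 days
find /yourappdir -name '*[0-9][0-9][0-9][0-9][0-9][0-9].log' -atime +30 -exec rm {} \;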

I am sure there are more in-depth ways to do this for people who script much better than I do.

And remember: use SAM for the system log files; it's better and much easier.
James Odak
Valued Contributor

Re: System Administration

When I put find / in the example above, I should have said /dir, the directory your application logs live in. If you use just /, you will also pick up the .log files of the system, which SAM would take care of for you.

Sorry,

Jim
MARIE ELDRIDGE
New Member

Re: System Administration

Thanks everyone for your help. Sorry I am late in responding, but I was out sick for a while. I never knew I had so many log files to keep track of.

Thanks again!
Constant learning keeps you on your toes and makes life interesting.
Alan Riggs
Honored Contributor

Re: System Administration

One thing to watch for in the scripted example: some logs have open file pointers at all times, which means that removing the file and moving another into place will corrupt the pointer reference (meaning you lose logging until the appropriate process is recycled). A safer way is:

tail -50 $LOGFILE > $TMPFILE
cat $TMPFILE > $LOGFILE

Note: tailing from the logical EOF is dangerous if you really want to preserve a large chunk of information, since the buffer will only hold 20K. That shouldn't be a worry for this problem, though.
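
Applied to the earlier loop, that safer pattern might look like this (untested sketch; paths are only examples):

for X in `cat /tmp/loglist`
do
    # keep the last 50 lines, then overwrite the log in place so any
    # process holding it open keeps a valid file descriptor
    tail -50 $X > /tmp/templog
    cat /tmp/templog > $X
    rm /tmp/templog
done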
Bill Hassell
Honored Contributor

Re: System Administration

There are two script samples: dailylogs.sh and weeklylogs.sh located at:

ftp://contrib:9unsupp8@hprc.external.hp.com/sysadmin/cronjobs

The dailylogs script trims the fast-growing files, while weeklylogs trims a given logfile if it exceeds a set size. Each script is designed to be expanded and modified for local situations.
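
Not those scripts themselves, but the "trim only if the log exceeds a set size" idea looks roughly like this (untested sketch; the log path, threshold, and line count are placeholders):

LOG=/var/adm/syslog/syslog.log
MAX=5000000                       # bytes; trim once the file passes ~5 MB

SIZE=`wc -c < $LOG | awk '{print $1}'`
if [ "$SIZE" -gt "$MAX" ]
then
    # keep the tail end, then overwrite in place so the writer keeps its open descriptor
    tail -1000 $LOG > /tmp/trim.$$
    cat /tmp/trim.$$ > $LOG
    rm /tmp/trim.$$
fi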


Bill Hassell, sysadmin