TheJuiceman
Super Advisor

Finding files to trim

Hey gang,

/ on one of our systems is slowly growing. I have a two part question to ask the group....

1) Does anyone have a script that will look in / and find and list the largest files in / excluding all other mounted volumes? A script that would be portable or would require no editing as volumes are added/deleted would be preferred.

2) I was wondering if anyone had a script that would look for files to trim or delete on / or various other volumes (like /var or /opt). Is there a list of files or filetypes to look for that would qualify for occasional purging? This is for 11.11 systems.

Thanks for your help.
15 REPLIES
James R. Ferguson
Acclaimed Contributor
Solution

Re: Finding files to trim

Hi:

The following script will list (and sum) the size of all directories except mounted ones:

# ls -il / | perl -nae 'if ($.>1 && $F[0]!=2 && $F[1]=~/^d/) {$v=`du -ks /$F[9]`;print $v;@s=split(/\s+/,$v);$t+=$s[0]};END{print "$t\n"}'

This code relies on the fact that a directory that is a mountpoint will *always* have an inode=2. Remember, inode values are only unique within a filesystem.
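
For example (a quick check, assuming /var and /opt are separate filesystems on your box), 'ls -id' will show inode 2 for the mountpoint directories and some other inode number for an ordinary directory like /etc:

# ls -id /etc /var /opt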

Another way to list directory sizes is:

# du -xk /var|sort -k1nr|more

...The '-x' flag restricts 'du' to the filesystem of the mountpoint, and the sort reports the directory sizes in order from largest to smallest.

The idea of both of the above is to identify the directories that consume significant space.

As for looking for "large" files you could do something like:

# find /pathname -xdev -type f -size +10000c -exec ls -l {} \; | sort -k5nr

This would find all files in /pathname (without crossing mountpoints) whose size exceeds 10,000 bytes. The output is a standard 'ls -l' listing sorted from largest to smallest by size.

As for trimming and/or deleting files in '/var' and '/opt', the place to concentrate is '/var'. '/opt' should be static. See the manpages for 'hier(5)' for details.

That being said, files in:

/var/preserve --- holds old 'vi' recovery files; anything old here can be deleted.

/var/adm/sw --- do *NOT* manage this manually. The *only* safe way to reclaim space here is with 'cleanup'. Doing otherwise, you risk corrupting your Installed Product Database, and you will not be able to patch your server again!

/var/adm/wtmp, /var/adm/btmp --- these are the logs of successful and unsuccessful logins respectively. Trim them to zero by redirecting '/dev/null' into them, or trim them with 'fwtmp' by converting them to ASCII, editing out the old entries, and converting back to binary (a short sketch follows below).

/var/adm/crash --- should only contain unanalyzed crash dumps from high-priority machine checks. If you find something old here, you can delete it unless you want HP Support to analyze it.

/var/adm/mail --- have your users clean up their mail folders and this won't be a problem.
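
A minimal sketch of the 'fwtmp' round-trip mentioned above (assuming fwtmp is in the usual HP-UX accounting location, /usr/sbin/acct, and using /tmp/wtmp.ascii as a scratch file; '-ic' converts the edited ASCII back to binary):

# /usr/sbin/acct/fwtmp < /var/adm/wtmp > /tmp/wtmp.ascii
(edit /tmp/wtmp.ascii and delete the entries you no longer want)
# /usr/sbin/acct/fwtmp -ic < /tmp/wtmp.ascii > /var/adm/wtmp

...or, to simply truncate a log to zero length:

# cat /dev/null > /var/adm/btmp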

Regards!

...JRF...
TheJuiceman
Super Advisor

Re: Finding files to trim

Hey James,

Wow...what a great response!!! That deserves 20 points!!!

I have another question along these lines...

I have a file that contains a list of directories. I want to sort the files in those directories by size so that the largest file is displayed first. Something like this...

find /{this is where I want to check the list of dirs in my file} -depth -type f -exec ll {} \; >> /tmp/file1

sort -r -n -k 5,5 /tmp/file1 > /tmp/file2

How can I write this so that it will go through my file and do the find on each of the entries in the file so I can then do the sort?

Thanks for the help.
RAC_1
Honored Contributor

Re: Finding files to trim

There are some simple commands that will do the work. You need to be careful about trimming files: you should understand what those files are and what processes might be accessing them.

du -kx / | sort -n --> will list directories sorted by size in ascending order

Once you note the directories that are using most of the space, run du on them again to see what is holding the space.
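
For example, if /var turns out to be the biggest consumer, drill into it the same way (a sketch only; substitute whatever directory you found):

# du -kx /var | sort -n | tail -20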
There is no substitute to HARDWORK
TheJuiceman
Super Advisor

Re: Finding files to trim

Is there a way to get a sort of just the files? Here is what I'm trying to do...

I have a file...FILE1 that contains the following...

entry1
entry2
entry3

I want to run a find on the entries in the file so I can collect and sort the output.

Also, getting all the information that an 'ls -l' gives would help tremendously.
Arunvijai_4
Honored Contributor

Re: Finding files to trim

# du -skx /usr/* | sort -nr | head -10

(Lists the 10 largest entries under /usr)

# ls -lR | sort +4 -5nr | more

(Lists all files, biggest first)

-Arun
"A ship in the harbor is safe, but that is not what ships are built for"
TheJuiceman
Super Advisor

Re: Finding files to trim

I REALLY REALLY need this to do the find on the entries in the file. How can I do that? Thanks.
RAC_1
Honored Contributor

Re: Finding files to trim

for i in $(cat FILE1)
do
find ${i} -exec ll -d {} \; | sort -nrk5
done
There is no substitute to HARDWORK
Devender Khatana
Honored Contributor

Re: Finding files to trim

Hi,

# du -xk `cat /home/hp/file1` | sort -rnk1 | pg

Here /home/hp/file1 is your file, and you can execute the command from any location. The contents of the file are absolute paths, like:

/var
/opt
/home

HTH,
Devender
Impossible itself mentions "I m possible"
James R. Ferguson
Acclaimed Contributor

Re: Finding files to trim

Hi (again):

Perhaps this is what you are seeking (based upon your last post, above):

# cat filelist.sh
#!/usr/bin/sh
while read LINE
do
find ${LINE} -xdev -type f | xargs ls -l
done < filelist | \
sort -k5nr
exit 0

...The 'filelist' is a simple file with directories of interest listed one per line. For example:

# cat filelist
/etc
/sbin
/stand

Note that you don't need any temporary files (only pipes), and that I have avoided the more expensive '-exec' with 'find', which would spawn a separate process for every 'ls' request. 'xargs' bundles many file names into each 'ls' invocation and greatly reduces the number of subprocesses started. The 'find' traversal of the directories is brutal enough on any system!
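
For contrast, the per-file form this avoids would look something like the sketch below; it forks one 'ls' for every file 'find' locates:

find ${LINE} -xdev -type f -exec ls -l {} \;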

Regards!

...JRF...