Operating System - HP-UX

Identifying dirs with lots of files

 
SOLVED
Robert Funk_1
Contributor

Identifying dirs with lots of files

Hi, I'm a new Unix admin and am trying to identify directories on my HP-UX 11.00 server that contain a large number of files, so I can target them for nightly cleanup.

I've discovered the du command, which is somewhat helpful, but I would like a command or script that tells me the number of files in each directory.

Thank you...
6 REPLIES
Pete Randall
Outstanding Contributor

Re: Identifying dirs with lots of files

Something like this, perhaps:

echo "directory counts" > /tmp/dircount.out
# while read copes with directory names containing spaces; note that
# ll's "total" line adds one to each count
find /start_point -type d | while read dir
do
echo "$dir: $(ll "$dir" | wc -l)" >> /tmp/dircount.out
done


Pete
Mark Grant
Honored Contributor

Re: Identifying dirs with lots of files

Robert

A script like this is not too much of a problem if you start with the "ls -lR" command, but I'm concerned that it is a really uncommon thing to do. In most cases, the files on a Unix system that are good targets for deletion live in /tmp and /var/tmp, and perhaps in users' home directories.

I think the "find" command will be more useful for you. It allows you to search for large files, files that haven't been modified in ages, or files called "core" (a very good target for deletion, that one), and the deletion can be combined into the same command. For example, to delete all files on the system called core:

find / -name core -exec rm {} \;
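A sketch of the same pattern, exercised in a throwaway directory so nothing real is deleted (the scratch directory and file names here are illustrative, not from the thread; -type f is added so only regular files match):

```shell
#!/bin/sh
set -e
# Scratch area so the demonstration touches nothing real.
tmpdir=$(mktemp -d)
touch "$tmpdir/core" "$tmpdir/keep.txt"

# Same shape as "find / -name core -exec rm {} \;" above, but scoped
# to the scratch directory and restricted to regular files.
find "$tmpdir" -type f -name core -exec rm {} \;

ls "$tmpdir"      # keep.txt survives; core is gone
rm -r "$tmpdir"
```

The same -exec deletion combines with age tests, e.g. -mtime +7 to match only files untouched for more than a week.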

Maybe you need to look at the man page for "find" and work out where the redundant files actually reside. I think the number of files in a directory is a bit of a blunt tool.
Never precede any demonstration with anything more predictive than "watch this"
Umapathy S
Honored Contributor

Re: Identifying dirs with lots of files

find . -type d | while read dir
do
echo "$dir"; ls -l "$dir" | wc -l
done

HTH,
Umapathy
Arise Awake and Stop NOT till the goal is Reached!
A. Clay Stephenson
Acclaimed Contributor

Re: Identifying dirs with lots of files

I don't like to run find from the root directory because of the system load, so I changed this to run from the current working directory.

#!/usr/bin/sh

typeset -i10 BIG=500 # defines what you call "lots of files"
find . -type d | while read X
do
typeset -i10 KNT=$(ls -a ${X} | wc -l)
if [[ ${KNT} -ge ${BIG} ]]
then
echo "${X}\t${KNT}"
fi
done

You can change "find ." to "find /" to do this from the root directory. You could simplify this by using "find . -type d -size +30000c", but that is not a good test because directories that have had many files removed will still be large.
If it ain't broke, I can fix that.
Brian Bergstrand
Honored Contributor

Re: Identifying dirs with lots of files

Each subdirectory holds a link back to its parent (via ".."), so a directory's link count is 2 plus its number of immediate subdirectories. That means you can use find to search for dirs with more than a certain # of links.

find / -type d -links +1000 -print

Regular files don't add to a directory's link count, so strictly this finds directories with around 1000 sub-directories rather than 1000 files; still, directories that accumulate that many entries are usually worth a look.
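A quick way to see the link-count rule (a sketch in a scratch directory; the exact count of 5 assumes a conventional filesystem where directories carry the standard "." and ".." links):

```shell
#!/bin/sh
set -e
# Three subdirectories and two regular files: the directory's link count
# is 2 + 3 = 5 (its entry in the parent, its own ".", and one ".." per
# subdirectory). The regular files contribute nothing.
tmpdir=$(mktemp -d)
mkdir "$tmpdir/a" "$tmpdir/b" "$tmpdir/c"
touch "$tmpdir/f1" "$tmpdir/f2"

ls -ld "$tmpdir"   # second column is the link count: typically 5 here
rm -r "$tmpdir"
```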

HTH.
Sanjay_6
Honored Contributor
Solution

Re: Identifying dirs with lots of files

Hi,

Try this,

find . -type d | while read dir
do
echo "$dir => $(find "$dir" -type f | wc -l)"
done
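One caveat with this approach: the inner find recurses, so each parent directory reports the cumulative total for everything below it. If you want only the files sitting directly in each directory, a portable sketch (older finds such as HP-UX's have no -maxdepth) uses the classic -prune idiom:

```shell
#!/bin/sh
# Count only the files immediately inside each directory. Starting the
# inner find at "$dir/." lets '! -name . -prune' stop it from descending
# into subdirectories, and -type f keeps regular files only.
find . -type d | while read dir
do
    echo "$dir => $(find "$dir/." ! -name . -prune -type f -print | wc -l)"
done
```

With this variant a directory full of subdirectories but no loose files reports 0, which is usually what you want when picking cleanup targets.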

Hope this helps.

Regds