Finding files to trim
01-17-2006 11:28 AM
/ on one of our systems is slowly growing. I have a two part question to ask the group....
1) Does anyone have a script that will look in / and find and list the largest files in / excluding all other mounted volumes? A script that would be portable or would require no editing as volumes are added/deleted would be preferred.
2) I was wondering if anyone had a script that would look for files to trim or delete on / or various other volumes (like /var or /opt). Is there a list of files or filetypes to look for that would qualify for occasional purging? This is for 11.11 systems.
Thanks for your help.
Solved! Go to Solution.
- Tags:
- find
01-17-2006 12:01 PM
Solution
The following script will list (and sum) the sizes of all directories except mounted ones:
# ls -il / | perl -nae 'if ($.>1 && $F[0]!=2 && $F[1]=~/^d/) {$v=`du -ks $F[9]`;print $v;@s=split(/\s+/,$v);$t+=$s[0]};END{print "$t\n"}'
This code relies on the fact that a directory that is a mountpoint will *always* have an inode=2. Remember, inode values are only unique within a filesystem.
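The inode test inside that one-liner can be isolated with awk; here is a minimal sketch run against a canned 'ls -il'-style listing (the directory names and sizes are made up, not output from a real system):

```shell
# Filter an 'ls -il' listing down to directories whose inode is NOT 2,
# i.e. directories that are not mountpoints (a mountpoint's root carries inode 2).
printf 'total 10\n2 drwxr-xr-x 2 root root 96 Jan 1 00:00 var\n300 drwxr-xr-x 2 root root 96 Jan 1 00:00 etc\n' |
awk 'NR > 1 && $1 != 2 && $2 ~ /^d/ { print $NF }'
# prints: etc   (var is skipped because its inode is 2)
```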
Another way to list directory sizes is :
# du -xk /var|sort -k1nr|more
...This restricts 'du' to the filesystem containing /var (the '-x' flag stops it crossing mountpoints) and reports directory sizes beneath it, largest-to-smallest.
The idea of both the above is to identify directories which have significant space.
As for looking for "large" files you could do something like:
# find /pathname -xdev -type f -size +10000c -exec ls -l {} \; | sort -k5nr
This would find all files in /pathname (without crossing mountpoints) whose size exceeds 10,000 characters. The output would be a standard 'ls' file listing sorted from largest to smallest by size.
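The '-size +10000c' behaviour can be rehearsed on scratch files before pointing it at a real volume; a sketch, with made-up file names under a temporary directory:

```shell
# Create one file above and one below the 10,000-character threshold,
# then run the same find/sort pipeline against the scratch directory.
d=$(mktemp -d)
dd if=/dev/zero of="$d/big"   bs=1000 count=20 2>/dev/null   # 20,000 bytes
dd if=/dev/zero of="$d/small" bs=100  count=1  2>/dev/null   #    100 bytes
find "$d" -xdev -type f -size +10000c -exec ls -l {} \; | sort -k5nr
rm -rf "$d"
# only "big" is listed; "small" falls under the size threshold
```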
As for trimming and/or deleting files in '/var' and '/opt', the place to concentrate is '/var'. '/opt' should be static. See the manpage for 'hier(5)' for details.
That being said, files in:
/var/preserve --- holds old 'vi' recovery files; anything stale can be deleted.
/var/adm/sw --- do *NOT* manage this manually. The *only* safe way to reclaim space here is to use 'cleanup'. Doing otherwise risks corrupting your Installed Product Database, after which you will not be able to patch your server!
/var/adm/wtmp, /var/adm/btmp --- these are logs of successful and unsuccessful logins respectively. Trim them to zero by redirecting '/dev/null' to them or trim them with 'fwtmp' by converting them to ASCII to edit them and reconverting them to binary after they have been trimmed.
/var/adm/crash --- should contain only unanalyzed crash dumps from high-priority machine checks. If you find something old here, you can delete it unless you want HP Support to analyze it.
/var/adm/mail --- have your users cleanup their mail folders and this won't be a problem.
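The zero-trim suggested above for wtmp/btmp can be rehearsed safely on a scratch file first (a sketch; the mktemp file stands in for /var/adm/wtmp):

```shell
# Truncate a file to zero bytes in place; the file keeps its inode and
# permissions, which matters when a running process holds it open.
f=$(mktemp)
echo "some accumulated log data" > "$f"
: > "$f"            # redirect nothing into the file: length becomes 0
wc -c < "$f"        # prints 0
rm -f "$f"
```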
Regards!
...JRF...
- Tags:
- trim logfile
01-17-2006 02:30 PM
Re: Finding files to trim
Wow...what a great response!!! That deserves 20 points!!!
I have another question along these lines...
I have a file that contains a list of directories. I want to sort those files by size so that the largest file is displayed first. Something like this...
find /{this is where I want to check the list of dirs in my file} -depth -type f -exec ll {} \; >> /tmp/file1
sort -r -n -k 5,5 /tmp/file1 > /tmp/file2
How can I write this so that it will go through my file and do the find on each of the entries in the file so I can then do the sort?
Thanks for the help.
- Tags:
- Sort
01-17-2006 02:36 PM
Re: Finding files to trim
# du -kx / | sort -n --> will list directories sorted by size in ascending order.
Once you note the directories that are using most of the space, run du on them again to see what is holding the space.
01-17-2006 03:16 PM
Re: Finding files to trim
I have a file...FILE1 that contains the following...
entry1
entry2
entry3
I want to run a find on the entries in the file so I can collect and sort the output.
Also, getting all the information that an 'ls -l' gives would help tremendously.
01-17-2006 03:23 PM
Re: Finding files to trim
(List all files starting with biggest)
# ls -lR | sort +4 -5nr | more
(List the top 10 files)
# ls -lR | sort +4 -5nr | head -10
-Arun
01-17-2006 04:49 PM
Re: Finding files to trim
01-17-2006 05:00 PM
Re: Finding files to trim
while read i
do
find ${i} -exec ll -d {} \; | sort -nrk5
done < FILE1
01-17-2006 05:04 PM
Re: Finding files to trim
# du -xk `cat /home/hp/file1` | sort -rnk1 | pg
Here /home/hp/file1 is the list file, and you can run the command from any location. The file's contents are absolute paths, like:
/var
/opt
/home
HTH,
Devender
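This can be tried out without touching real system volumes; a sketch where scratch directories stand in for /var and /opt:

```shell
# Build two scratch directories, put an 8 KB file in one, and run the
# same du/sort pipeline over the list file.
base=$(mktemp -d)
mkdir "$base/var" "$base/opt"
printf '%s\n' "$base/var" "$base/opt" > "$base/file1"
head -c 8192 /dev/zero > "$base/var/log"
du -xk `cat "$base/file1"` | sort -rnk1
rm -rf "$base"
# the directory holding the 8 KB file sorts to the top
```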
01-17-2006 11:46 PM
Re: Finding files to trim
Perhaps this is what you are seeking (based upon your last post, above):
cat filelist.sh
#!/usr/bin/sh
while read LINE
do
find ${LINE} -xdev -type f | xargs ls -l
done < filelist | \
sort -k5nr
exit 0
...The 'filelist' is a simple file with directories of interest listed one per line. For example:
# cat filelist
/etc
/sbin
/stand
Note that you don't have to use any temporary files (only pipes) and that I have avoided the more expensive '-exec' with 'find' which would spawn a separate process for every 'ls' request. The 'xargs' bundles multiple requests and greatly reduces subprocess initiation. The 'find' traversing directories is brutal enough on any system!
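The '-exec' versus 'xargs' point can be seen with a scratch directory (a sketch; all paths are temporary):

```shell
# 'find -exec ls -l {} \;' would spawn one 'ls' process per file;
# piping to xargs lets a single 'ls -l' invocation handle every filename.
dir=$(mktemp -d)
( cd "$dir" && touch a b c d e )
find "$dir" -type f | xargs ls -l | wc -l   # 5 lines, from one 'ls' call
rm -rf "$dir"
```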
Regards!
...JRF...