more scripting help on calculations
06-13-2006 09:38 PM
I have a filesystem where the data is scattered all over the place and no archiving or housekeeping has happened for some time. I have a definitive list of all files, their owners, and sizes.
#!/bin/ksh
# This script works on a file created with: find . | xargs ls -ld | sort -rnk5 > usrdump.lst
# i.e. after the definitive list has been created.
# Create a list of unique users - you can store this in an array, or better in a temp file.
cat usrdump.lst | tail +1 | awk '{print $3}' | sort | uniq > usrlist.log
# For each user in the list above, print the file sizes and add them up.
for usrname in `awk '{print $1}' usrlist.log`
do
    echo " checking user $usrname"
    filesize_count=0
    for filesize in `grep "$usrname " usrdump.lst | awk '{print $5}'`
    do
        filesize_count=$filesize_count+$filesize
    done
    echo "$usrname\ttotal\t$filesize"
done
The problem I have is that the usrdump file is over 120 MB and the system runs out of memory:
--> ./filesystemcounter.sc
checking user cronlog
./filesystemcounter.sc[15]: 0403-029 There is not enough memory available now.
There is 16 GB of memory in the system.
Anyone have a better solution to calculate the filesystem usage for all users accessing the FS?
Many Thanks again.
Solved! Go to Solution.
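Not from the thread, but for comparison: the per-user grep loop re-reads the 120 MB dump once per user, and the backtick substitutions load every matching size into the shell at once. A single awk pass over the dump avoids both. A minimal sketch with made-up sample data, assuming usrdump.lst has `ls -ld`-style columns (owner in field 3, size in field 5):

```shell
# Made-up sample in the same column layout as the real usrdump.lst:
# perms links owner group size ... name
cat > usrdump.lst <<'EOF'
-rw-r--r-- 1 alice staff 1048576 Jun 13 2006 bigfile
-rw-r--r-- 1 bob   staff  524288 Jun 13 2006 medium
-rw-r--r-- 1 alice staff  524288 Jun 13 2006 other
EOF

# One pass: sum field 5 (size) keyed on field 3 (owner), then print one
# total per user in KB. No per-user grep, no huge command substitutions.
awk '{ tot[$3] += $5 } END { for (u in tot) printf "%s\t%.0f KB\n", u, tot[u]/1024 }' usrdump.lst | sort
```

This reads the dump exactly once regardless of how many users it contains, so memory use is proportional to the number of distinct owners, not the file size.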
06-13-2006 09:48 PM
Re: more scripting help on calculations
How about something like this:
# du -ks * /home | awk '{ print int($1/1024+.5)" MB"" "$2}'
Regards,
Robert-Jan
- Tags:
- awk
06-13-2006 10:03 PM
Re: more scripting help on calculations
You can do:
for i in `awk '{print $1}' usrlist.log`
do
    echo "$i uses \c"
    find ./ -type f -user $i | xargs ls -l | awk 'BEGIN {tot=0} {tot+=$5} END {print tot/1024,"KB"}'
done
Regards,
Ninad
06-13-2006 10:04 PM
Re: more scripting help on calculations
I have the list of all files:
# find . | xargs ls -ld | sort -rnk5 > usrdump.lst
However, I then need to name and shame the users with the most filesystem usage, and I want to use the script to prove it.
Thanks
- Tags:
- find
06-13-2006 10:20 PM
Re: more scripting help on calculations
for USER in `awk '{print $1}' usrlist.log`
do
    echo " checking user $USER"
    echo "$USER \c"; find . -type f -user $USER -xdev | xargs ll | awk '{ x += $5 } END { print "total bytes: " x }'
done
Enrico
06-13-2006 10:22 PM
Re: more scripting help on calculations
I was hoping I could interrogate the usrdump.lst file to work out the sizes.
Is that possible as well?
TY
06-13-2006 10:45 PM
Solution
Anyhow, try this:
for user in `awk '{print $1}' usrlist.log`
do
    echo "$user uses \c"
    grep $user usrdump.lst | awk 'BEGIN {tot=0} {tot+=$5} END {print tot/1024,"KB"}'
done
Regards,
Ninad
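One caveat with this solution, worth noting: `grep $user` matches anywhere on the line, so a user name that happens to be a substring of another user name, or of a file name, inflates the total. Matching the owner field exactly in awk avoids that. A sketch with made-up sample data (same assumed `ls -ld` layout: owner in field 3, size in field 5):

```shell
# Made-up sample: joanne owns a file whose *name* contains "ann".
cat > usrdump.lst <<'EOF'
-rw-r--r-- 1 ann    staff 2048 Jun 13 2006 notes
-rw-r--r-- 1 joanne staff 4096 Jun 13 2006 ann_report
EOF

user=ann
# Compare the owner field ($3) exactly instead of grepping the whole line,
# so "ann" does not also pick up joanne's file named "ann_report".
awk -v u="$user" '$3 == u { tot += $5 } END { printf "%s uses %.0f KB\n", u, tot/1024 }' usrdump.lst
```

With plain `grep ann usrdump.lst` both lines would match and the total would be 6 KB instead of 2 KB.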
06-14-2006 12:11 AM
Re: more scripting help on calculations
If you want to quickly summarize the utilization (total characters) by user within any filesystem, I'd use Perl. Something like this should do:
# cat ./userutil
#!/usr/bin/perl
#@(#)userutil $ Summarize by user total file sizes - JRF $
use strict;
use warnings;
use File::Basename;
use File::Find;
my %util;
my ($uid, $size, $name);
my $path = @ARGV ? shift : ".";
sub wanted {
    return unless -f;
    ($uid, $size) = ((stat($_))[4,7]);
    return unless $uid > 10;
    $util{$uid} += $size;
}
File::Find::find(\&wanted, $path);
for $uid (keys %util) {
    $name = getpwuid($uid);
    $name = $uid unless defined $name;
    printf "%8s %10d\n", $name, $util{$uid};
}
1;
Users whose 'uid' is ten (10) or less are skipped. This eliminates files owned by "root" and "bin", in particular.
Only files, not directories, are examined. Run the script passing a directory (filesystem), e.g. "/home":
# ./userutil /tmp
spfmweb 175
oper 5026
jrf 48
Regards!
...JRF...
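Since the goal is to name the heaviest users, the Perl script's output (user, total bytes) can be piped straight into a numeric sort. A sketch, using made-up output captured to a hypothetical file purely for illustration; in practice you would pipe `./userutil /home` directly into `sort`:

```shell
# Hypothetical sample of userutil-style output: user name, total bytes.
cat > userutil.out <<'EOF'
 spfmweb        175
    oper       5026
     jrf         48
EOF

# Rank users by total bytes, heaviest first (field 2 is the byte count):
sort -rnk2 userutil.out
```

This puts the biggest consumer (oper in the sample) at the top of the list.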
- Tags:
- Perl