ivy1234
Frequent Advisor

search files

I would like to find the files that are over 10 MB in size on the system. I know the "find" command can do that; can anyone advise whether there is any other command or method? As far as I know, "find" consumes a lot of system resources.

Thx

6 REPLIES
Pete Randall
Outstanding Contributor

Re: search files

Any other command that recursively traverses the file system is going to be resource intensive as well. You could come up with some sort of script using ls -R, piped to awk to capture the size field, piped to a comparison against your criteria - but this will consume just as many resources as the (much simpler) find command.

find /startdir -type f -size +10000000c | xargs ll

or something like that should do.
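For comparison, the ls -R/awk script described above might look something like the sketch below (assuming long-format output where the size is field 5; /startdir is a hypothetical starting directory). Note that ls -lR prints names relative to each subdirectory, so recovering full paths would need extra awk work - one more reason find is simpler.

```shell
# Sketch of the "ls -R piped to awk" idea, for comparison only.
# Walks the whole tree just like find, so it is no cheaper.
dir=/startdir          # hypothetical starting directory
ls -lR "$dir" 2>/dev/null |
awk '$5 ~ /^[0-9]+$/ && $5 > 10000000 { print $5, $NF }'
```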


Pete
R.O.
Esteemed Contributor

Re: search files

You can run the find command by cron:

find / -type f -size +10000000c -exec ls -l {} \; > textfile.txt

during hours of low system usage. Later, you can review the text file.
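A crontab entry for that might look like the line below (the schedule and the output path are illustrative assumptions; edit your crontab with "crontab -e"):

```shell
# Illustrative crontab line: run the scan at 02:00 every Sunday and
# keep the listing in /tmp/bigfiles.txt (path is an assumption).
0 2 * * 0 find / -type f -size +10000000c -exec ls -l {} \; > /tmp/bigfiles.txt 2>/dev/null
```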

Regards,
"When you look into an abyss, the abyss also looks into you"
Pete Randall
Outstanding Contributor

Re: search files

Using the "-exec" syntax spawns a subprocess for each file, and that IS resource intensive. That is why I used the "|xargs" syntax.


Pete
Dennis Handly
Acclaimed Contributor

Re: search files

>Pete: Using the "-exec" syntax spawns a subprocess for each

Right, unless you use "+":
find /startdir -size +10000000c -exec ll {} +
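A quick way to see the difference (with echo standing in for ll here): with "\;" find runs the command once per matching file, while with "+" it runs it once per batch of files, much like xargs does.

```shell
# Count how many times the exec'd command actually runs.
tmp=$(mktemp -d)
touch "$tmp/a" "$tmp/b" "$tmp/c"
find "$tmp" -type f -exec echo run {} \; | wc -l   # one line per file: 3
find "$tmp" -type f -exec echo run {} +  | wc -l   # one line per batch: 1
rm -rf "$tmp"
```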
bullz
Super Advisor

Re: search files

Yes, the above commands will help you find the files that are larger than 10 MB.

du -akx | sort -nr | more

In general, I use the above command to find the files consuming the most space.
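To keep the output manageable, that du pipeline can be trimmed to the top entries instead of paging through everything with more (the directory /var below is just an example):

```shell
# -a lists files as well as directories, -k reports sizes in KB,
# -x stays on one filesystem; head keeps only the 20 largest entries.
du -akx /var 2>/dev/null | sort -nr | head -20
```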
Steven Schweda
Honored Contributor

Re: search files

> [...] |xargs ll

> [...] -exec ls -l {} [...]

> [...] -exec ll {} [...]

Who asked to see the file sizes?
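In other words, if only the file names are needed, find on its own is enough - no ls or ll subprocesses at all (/startdir is hypothetical again):

```shell
# find already prints the matching path names itself.
find /startdir -type f -size +10000000c -print 2>/dev/null
```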