Operating System - HP-UX

p7
Frequent Advisor

deleting large # files

Hi all,

I have a subdirectory with about 700,000 files dating back to 1999. It's taking a lot of space, but the client wants me to leave 2009 and 2010 alone. Does anyone know of an efficient way to do this?

Thanks in advance
3 REPLIES
Pete Randall
Outstanding Contributor
Solution

Re: deleting large # files

The find command is typically the answer.

find /startdir -type f -mtime +730 | xargs rm

will remove everything except files modified within the last two years (730 days). Adjust the number of days to get the results you want, and test first by substituting ls for rm.
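
A minimal sketch of that workflow, assuming /startdir is the directory in question and a 730-day cutoff is what you want (both are placeholders to adjust):

# Dry run first: list what would be removed, delete nothing
find /startdir -type f -mtime +730 | xargs ls -l | more

# Real pass: xargs batches the 700,000 names so no single rm
# command line gets too long
find /startdir -type f -mtime +730 | xargs rm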


Pete

Patrick Wallek
Honored Contributor

Re: deleting large # files

However you do it, this is likely going to take a LONG TIME to run. Having to traverse 700,000 files is not trivial!
James R. Ferguson
Acclaimed Contributor

Re: deleting large # files

Hi:

You might find it faster to copy the two years' worth of files you want to keep to another filesystem, 'newfs' the old filesystem (destroying everything therein), and copy the saved files back in.
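
For illustration, a rough sketch of that sequence, assuming the data sits in /data (logical volume /dev/vg01/lvol5) and /spare has room for the files being kept; every name here is a placeholder, and you would verify the copy before running newfs:

# Copy only the files modified in roughly the last two years
cd /data
find . -type f -mtime -730 | cpio -pdm /spare/keep

# Recreate the old filesystem (this destroys everything in it)
umount /data
newfs -F vxfs /dev/vg01/rlvol5
mount -F vxfs /dev/vg01/lvol5 /data

# Copy the saved files back
cd /spare/keep
find . -type f | cpio -pdm /data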

If you can't do that, you might consider re-mounting your filesystem (assuming you have OnlineJFS) with mount options like '-o nolog,convosync=delay' for the duration of the deletions. This should improve performance. It would be good to have a filesystem backup, of course, beforehand.
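
As a sketch, assuming a VxFS filesystem mounted at /data from /dev/vg01/lvol5 (both placeholders), the remount might look like:

# Relax intent logging and O_SYNC conversion while the deletes run
mount -F vxfs -o remount,nolog,convosync=delay /dev/vg01/lvol5 /data

# When finished, remount with your usual options (or umount and
# mount again) to restore normal behaviour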

Regards!

...JRF...