Community Home > Servers and Operating Systems > Operating Systems > Operating System - HP-UX > Re: little doubt...
12-05-2005 01:49 AM
little doubt...
When I execute "rm -rf *" in a folder containing 390,345 files, it fails with an error about the parameter list. I am thinking that * cannot handle that huge a number of files; it must only accept up to some limit. Can somebody tell me what that limit is?
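The error does not come from rm itself: the shell expands * fine, then fails when it tries to exec rm with the whole list of names as arguments. The limit is the kernel's ARG_MAX, a byte limit, not a file count. A quick sketch to inspect it (the exact value varies by OS and kernel tunables):

```shell
# getconf reports the kernel's limit on the combined size of the argument
# list and environment passed to exec(). The failure happens at exec time,
# when the shell hands rm ~390,000 expanded paths.
getconf ARG_MAX

# Rough size of the argument list that failed: number of files times
# (average filename length + 1 for each terminating NUL byte). With
# ~15-character names, that is several megabytes.
echo $((390345 * 16))
```

So whether a given * succeeds depends on the total length of the expanded names, not on any fixed number of files.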
12-05-2005 01:55 AM
Re: little doubt...
Too many files for one delete: too many arguments.
Every program, even rm, is subject to a limit on the argument list it can be handed; the kernel enforces it (ARG_MAX) when the shell execs the command. The * is actually expanded by the shell into 390,345 arguments in this case.
Here is a workaround:
ls -1 > filelist
while read -r filename
do
    rm -f "$filename"
done < filelist
This will get it done.
Also: restructure your storage to prevent this many files from collecting in one folder. An ls command can take days to complete under these circumstances.
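One hypothetical way to act on that restructuring advice (a sketch; the temp directories, sample names, and one-character bucketing are illustrative assumptions, nothing HP-UX-specific) is to spread the files over subdirectories keyed on the first character of each name:

```shell
# Spread a flat directory's files into per-character buckets so that no
# single directory ever accumulates hundreds of thousands of entries.
SRC=$(mktemp -d)    # stand-in for the overfull directory
DST=$(mktemp -d)    # destination tree of small buckets
touch "$SRC/alpha" "$SRC/apple" "$SRC/beta"    # sample files

ls -1 "$SRC" | while read -r name
do
    bucket=$(printf '%s' "$name" | cut -c1)    # first character of the name
    mkdir -p "$DST/$bucket"
    mv "$SRC/$name" "$DST/$bucket/"
done
```

Deeper or hash-based bucketing works the same way; the point is keeping each directory's entry count small enough that ls and * stay cheap.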
SEP
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
12-05-2005 01:58 AM
Re: little doubt...
Check Bill's answer in this thread.
http://forums1.itrc.hp.com/service/forums/questionanswer.do?threadId=103792
You could use find and xargs.
# find /folder -xdev -type f | xargs rm -f
Best regards,
Robert-Jan
12-05-2005 02:05 AM
Re: little doubt...
The shell can expand * up to the kernel's limit, ARG_MAX, which caps the total bytes of arguments passed to a command; it is not a fixed number of files.
Try using find with '-exec'; a new process is spawned for every file to remove:
# find /your_dir -name '*' -exec rm {} \;
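For comparison, later POSIX versions of find can batch the arguments itself using the '+' terminator, which groups paths the way xargs does instead of forking one rm per file (this may not be available on a 2005-era HP-UX find; a sketch on a throwaway directory):

```shell
# '-exec rm -f {} +' fills each rm invocation with as many paths as fit
# under the argument-size limit, instead of one rm process per file.
dir=$(mktemp -d)
touch "$dir/a" "$dir/b" "$dir/c"

find "$dir" -type f -exec rm -f {} +
```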
Regards,
Sergejs
12-05-2005 02:38 AM
Re: little doubt...
12-05-2005 02:46 AM
Re: little doubt...
To get the directory back in service with minimal delay, try something like the following. You will still have to wait while the space used by the old files is released, but this also releases the space consumed by the oversized directory itself.
# Edit the next three lines appropriately
DIRNAME=x
MODE=640
OWNER=owner:group
mkdir newdir.$$
chmod ${MODE} newdir.$$
chown ${OWNER} newdir.$$
mv ${DIRNAME} ${DIRNAME}.$$
mv newdir.$$ ${DIRNAME}
find ${DIRNAME}.$$ ! -type d -print0 | xargs -0 rm -f
rmdir ${DIRNAME}.$$
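The point of the renames is that users see an empty, correctly-owned directory immediately, while the slow deletion runs against the renamed copy. A compressed, runnable sketch of the same sequence (the temp base directory and sample files are stand-ins for the real path):

```shell
# Swap an overfull directory for a fresh empty one, then delete at leisure.
base=$(mktemp -d)
mkdir "$base/data"                       # stand-in for the overfull directory
touch "$base/data/f1" "$base/data/f2"    # sample files

mkdir "$base/newdir.$$"                  # build the replacement first
mv "$base/data" "$base/data.$$"          # instant: just a rename
mv "$base/newdir.$$" "$base/data"        # users now see an empty directory

# The slow part now happens out of the live path:
find "$base/data.$$" ! -type d -print0 | xargs -0 rm -f
rmdir "$base/data.$$"
```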
12-05-2005 02:46 AM
Re: little doubt...
It will never be instant, but you could try
echo rm -r $(ls -1)
If that prints the command you expect, remove the echo and run
rm -r $(ls -1)
Note, though, that the command substitution expands into one long argument list just as * does, so it is subject to the same limit.
Steve Steel
12-05-2005 02:46 AM
Re: little doubt...
Well, nothing is free. The most efficient removal is probably going to be achieved by leveraging 'xargs' to bundle groups of filenames for removal by 'rm'.
If you use '-exec' with a 'find' you are going to spawn a new task for every file to remove -- certainly very resource intensive.
Using 'rm' with one file at a time is probably going to be more costly than an 'xargs' solution too, since, again, a new process will need to be created for each file handled.
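The cost difference can be made visible by counting invocations. A sketch (the tiny batch size forced with 'xargs -n' and the wrapper that prints one marker per rm run are purely for illustration; without -n, xargs packs up to the ARG_MAX byte limit into each call):

```shell
# Ten files, batches of four: xargs needs only three rm invocations,
# where 'find -exec rm {} \;' would have forked ten rm processes.
dir=$(mktemp -d)
for i in 1 2 3 4 5 6 7 8 9 10; do touch "$dir/f$i"; done

# The sh -c wrapper deletes its batch, then prints one marker line,
# so counting the lines counts the rm batches.
batches=$(ls -1 "$dir" | sed "s|^|$dir/|" | \
    xargs -n 4 sh -c 'rm -f "$@"; echo batch' sh | wc -l)

echo "$batches"    # 3 (batches of 4, 4 and 2 files)
```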
Regards!
...JRF...
12-05-2005 02:49 AM
Re: little doubt...
12-05-2005 02:54 AM
Re: little doubt...
J.Furgeson had the correct answer:
ls | xargs rm -f
It'll take some time, but it will be much more efficient than looping over the files one at a time, since xargs bundles many names into each rm invocation.
HTH;
Doug
------
Senior UNIX Admin
O'Leary Computers Inc
linkedin: http://www.linkedin.com/dkoleary
Resume: http://www.olearycomputers.com/resume.html