
script ..

 
UnixT
Frequent Advisor

script ..

Hello Guru,

I am in need of a script to remove the files that come from the output of ls -ltr. There are many files, and we should not use a for loop either. I am requesting this for our DBA.

Can anyone help me here?
Michael Steele_2
Honored Contributor

Re: script ..

Hi

cd /dir
find . 'test' -exec rm {} \;

Regarding 'test': Need more information. If you read the man page on find you'll see many 'tests', based upon age, ownerships, etc.

Provide and I'll reciprocate.
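
To illustrate, here are a few of the 'tests' find understands; the directory, name pattern, age and owner below are placeholders only, to be replaced with whatever criteria actually apply:

find /dir -type f -name '*.log' -exec rm {} \;    # files matching a name pattern
find /dir -type f -mtime +7 -exec rm {} \;        # files last modified more than 7 days ago
find /dir -type f -user oracle -exec rm {} \;     # files owned by a particular user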
James R. Ferguson
Acclaimed Contributor

Re: script ..

Hi:

Your use of the '-t' option with 'ls' suggests that you might want to remove files older than some date/time. Look at the manpages for 'find'. You might want to do something like:

# find /path_to_files -type f -mtime +30 -exec rm {} +

...which removes files whose last modification was more than 30 days ago.

Regards!

...JRF...
UnixT
Frequent Advisor

Re: script ..

Hello,

My requirement is something different... I will give you a scenario below:

#ls -ltr

a
b
c
d
e
f
g

.
.
.
.
etc...

and I would like to remove the first few files (say, for example, 4 files: a, b, c, d). I feel that is enough to convey it. In this, no ownership or aging comes in.
James R. Ferguson
Acclaimed Contributor

Re: script ..

Hi (again):

> and I would like to remove the first few files (say, for example, 4 files: a, b, c, d). I feel that is enough to convey it. In this, no ownership or aging comes in.

OK, then you could do:

# cd /somepath && ls -t | head -4 | xargs rm

This changes to the directory you want and, if that succeeds, removes the first four (4) files listed, i.e. the most recently modified ones. If you want to remove the oldest files instead, do:

# cd /somepath && ls -rt | head -4 | xargs rm

It is assumed that only files, and no subdirectories, are present.
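
By way of illustration, assuming an empty scratch directory and the example names from this thread (both the directory and the names are made up just for the demo):

mkdir -p /tmp/rmtest && cd /tmp/rmtest
touch a; sleep 1; touch b; sleep 1; touch c; sleep 1; touch d; sleep 1; touch e
ls -rt | head -4 | xargs rm     # removes the four oldest: a b c d
ls                              # only e remains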

Regards!

...JRF...


R.K. #
Honored Contributor

Re: script ..

Hello UnixT,

You can use this:

--------------------
for I in $(ls -lt | tail -4 | awk '{ print $9 }')
do
rm -i "$I"
done
--------------------


rm -i <-- for interactive removal
UnixT
Frequent Advisor

Re: script ..

@R.K.

I am not supposed to use a for loop here.

@James,

I will try the same and come back to you.

Thanks
Steven Schweda
Honored Contributor

Re: script ..

> I am not supposed to use a for loop here.

Why not? Too easy?

> #ls -ltr
> [...] In this, no ownership or aging comes in.

What do you mean, "no [...] aging comes in"? What do you think that "ls -ltr" does, if not list files according to their ages? At least one of us is confused about what you want.
UnixT
Frequent Advisor

Re: script ..

Hi Steven,


I mentioned the ages because the earlier replies in the thread were talking about them; please go through those.

Next, why I am not using a for loop is because my DBA does not allow me to include one.

Re: script ..

>> why I am not using a for loop is because my DBA does not allow me to include one.

hmmm I can think of a good reason not to use for loops in this manner, but "my DBA told me not to" isn't it! (Although your DBA may be thinking of the same reason that I am)

Do you not have any curiosity? Don't you want to know _why_ you shouldn't use a for loop? Curiosity is the key to learning!

HTH

Duncan

Dennis Handly
Acclaimed Contributor

Re: script ..

rm $(ls -ltr | awk 'NR > 1 && NR <= 5 { print $9 }')

>Duncan: I can think of a good reason not to use for loops in this manner

Because 4 is too small to bother? :-)

Re: script ..

>Because 4 is too small to bother? :-)

No, actually because what if the filenames are _really_ long (or we have a lot more than 4 files - not valid for this example, but a generic reason to avoid for loops)? We can run into the problem of exceeding the maximum line length or argument count for the shell.

instead of:

for x in
do
...
done

I would always do:

cat | while read x
do
...
done

(syntactically incorrect, but you get the gist)
That lesson was learnt after debugging a script which deleted archive logs in a directory - it stopped working when there were more than 255 logs present...
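
Made concrete with this thread's "oldest four files" example (the ls -tr | head -4 feed and the rm are just that example; adapt the feed and the loop body to your own case), the pattern looks like:

ls -tr | head -4 | while read x
do
    rm "$x"     # each name arrives on its own line, so the command line
                # built for rm never grows with the number of files
done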

Duncan

Dennis Handly
Acclaimed Contributor

Re: script ..

>Duncan: actually because what if the filenames are _really_ long (or we have a lot more than 4 files...) we can run into the problem of exceeding the max line length or argument count for the shell.

In both cases, you can't use a for-loop nor my $(...). Typically 4 * (maximum filename length) will always be well under 2 Mb.

>I would always do: while read x
>(syntactically incorrect

You mean evil cat incorrect. :-)

Yes, while read or xargs.
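
A sketch of the xargs form, reusing the same ls -ltr / awk selection as above (the count of 4 is just this thread's example):

ls -ltr | awk 'NR > 1 && NR <= 5 { print $9 }' | xargs rm
# xargs builds the rm command from the names on stdin, splitting the work
# into several rm invocations if the list would ever exceed the system limits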