Operating System - HP-UX

if statement for file size

 
Ratzie
Super Advisor

if statement for file size

Is there a simple if statement for a file size?

if [ file size -gt 60 ]
than
blah
fi

This does not work.
9 REPLIES
A. Clay Stephenson
Acclaimed Contributor
Solution

Re: if statement for file size

Not directly; you need to leverage ls -l and awk (as but one method):

FNAME=myfile
typeset -i10 SZ=$(ls -l ${FNAME} | awk '{print $5}')
if [[ ${SZ} -gt 60 ]]
then
blah
blah
fi

The only built-in test is
if [[ -s ${FNAME} ]]
which tests if ${FNAME} exists and has a size greater than 0.
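A portable variant of the same check (my own sketch, not from the post above) reads the byte count with wc -c instead of parsing ls output; the scratch file name here is purely illustrative:

```shell
# Create a 70-byte sample file so the demonstration is self-contained
FNAME=/tmp/sizecheck.$$           # hypothetical scratch file
printf '%070d' 0 > "${FNAME}"

# wc -c prints the byte count; reading via stdin keeps the file name out of the output
SZ=$(wc -c < "${FNAME}")
if [ "${SZ}" -gt 60 ]
then
    echo "larger than 60 bytes"
fi
rm -f "${FNAME}"
```

This avoids depending on the column layout of ls -l output.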
If it ain't broke, I can fix that.
Hai Nguyen_1
Honored Contributor

Re: if statement for file size

You have a spelling problem ("than" should be "then"). It should have read:

if [ file size -gt 60 ]
then
blah
fi

Hai

Kent Ostby
Honored Contributor

Re: if statement for file size

ll | awk '$5 > 60'

Would get you all the files larger than 60 bytes.

Best regards,

Kent Ostby
"Well, actually, she is a rocket scientist" -- Steve Martin in "Roxanne"
Dave La Mar
Honored Contributor

Re: if statement for file size

if [ -n "$(find file -size +60c)" ]
then
blah
fi

Best regards,

dl
"I'm not dumb. I just have a command of thoroughly useless information."
RAC_1
Honored Contributor

Re: if statement for file size

ll | awk '{if ($5 >= 60) print}'

This will print files of 60 bytes or more.
There is no substitute to HARDWORK
Tom Smith_9
Frequent Advisor

Re: if statement for file size

Try this

LOGSIZE=$(du -k $LOG | awk '{print $1}')

Assuming $LOG is the name of the file.

Hope it helps.
Tom Smith_9
Frequent Advisor

Re: if statement for file size

Sorry, my last post was missing this:

if [ $(( $LOGSIZE > $LOGSIZELIMIT )) != 0 ] ; then
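Putting the two fragments above together, a complete sketch might look like this (the file name, the KB limit, and the sample data are placeholders of my own, not from the posts):

```shell
LOG=/tmp/demo.$$.log              # hypothetical log file
LOGSIZELIMIT=1                    # hypothetical limit, in KB
printf '%01500d' 0 > "$LOG"       # sample data so there is something to measure

# du -k reports the space used in KB; awk keeps only the number
LOGSIZE=$(du -k "$LOG" | awk '{print $1}')

# the arithmetic expansion evaluates to 1 (true) or 0 (false)
if [ $(( LOGSIZE > LOGSIZELIMIT )) != 0 ] ; then
    echo "log exceeds ${LOGSIZELIMIT} KB"
fi
rm -f "$LOG"
```

Note that du -k reports allocated space rounded up to whole kilobytes, not the exact byte count.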
Hein van den Heuvel
Honored Contributor

Re: if statement for file size


In Perl there is the -s file test operator.

Usage example:

perl -e 'while (<*>) { print -s," $_\n" if (-s > 6000)}'

Hein.
Elmar P. Kolkman
Honored Contributor

Re: if statement for file size

One thing to bear in mind: most shell commands have problems calculating, and perhaps comparing, values larger than 2 GB.
If your file might be in that range, you can alter Clay's shell solution to divide the size by 1024 in the awk part, for instance, to get the size in KB (awk '/file$/{printf "%d\n",$5/1024}').

You could also use 'wc -c', though I wouldn't recommend this for large files (it may literally count the bytes in the file!), or 'du -k | cut -f1' to get the size in KB.
'ls -s' is also a solution, but it reports the size in filesystem blocks, which might not be what you want.
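As a sketch of the 'du -k | cut -f1' route (the file name, sample size, and threshold are illustrative only; for a real 2 GB check you would compare against 2097152):

```shell
FNAME=/tmp/demo.$$                # hypothetical file
printf '%02048d' 0 > "$FNAME"     # 2 KB of sample data

# du -k keeps the arithmetic in KB, so 32-bit shell math never sees a >2G byte count
SZ_KB=$(du -k "$FNAME" | cut -f1)
if [ "$SZ_KB" -gt 1 ]; then
    echo "larger than 1 KB"
fi
rm -f "$FNAME"
```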

Every problem has at least one solution. Only some solutions are harder to find.