Operating System - HP-UX

Artificial limit of file sizes by row count?

 
A. Daniel King_1
Super Advisor

Hi, folks.

Is it possible to limit the size of a file based upon row count? I am aware of the default quota options (aggregate inode/size limits), but I do not know of a way to limit file size based upon number of lines - or individual file size.

It seems like this would be a complicated request - i.e., there is first an assumption of an ASCII file. Then, would this be something initiated at the file, directory or file-system level? Perhaps there are other complications as well.

However, I thought I'd ask before I said, "Impossible."

If enforcing a limit is impossible, I'll give 10 points to the response with the most efficient alarm for files over 100,000 lines in a given file system.
Command-Line Junkie
7 REPLIES
Patrick Wallek
Honored Contributor

Re: Artificial limit of file sizes by row count?

I don't think this is possible with something like quotas.

As you say, it would be very difficult to implement.
James R. Ferguson
Acclaimed Contributor

Re: Artificial limit of file sizes by row count?

Hi:

I don't know of anything that would "naturally" enforce this kind of limitation -- total character size, yes, but number of lines, no.

I suppose you could write a script that periodically uses something like 'wc -l file' to interrogate file line counts and, if the limit is exceeded, do whatever it is you want.

Regards!

...JRF...
A. Daniel King_1
Super Advisor

Re: Artificial limit of file sizes by row count?

Would it be possible to limit individual file size? Something like the 2GB limit that requires you to turn on 'largefile' support?
Command-Line Junkie
James R. Ferguson
Acclaimed Contributor

Re: Artificial limit of file sizes by row count?

Hi (again):

You can disable 'largefiles' support ('nolargefiles') on a filesystem basis, thereby limiting creation of files within that filesystem to 2GB. However, this does not address your 100,000 *line* limitation.

Regards!

...JRF...
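For reference, a 'nolargefiles' mount might look like the following. The device file and mount point here are hypothetical -- substitute your own volume group and directory:

```shell
# Mount a VxFS filesystem with largefiles disabled (device and mount
# point are placeholders -- adjust for your environment):
#
#   mount -F vxfs -o nolargefiles /dev/vg01/lvol4 /data
#
# Or make it permanent via an /etc/fstab entry:
#
#   /dev/vg01/lvol4  /data  vxfs  delaylog,nolargefiles  0  2
```

Note that mounting with 'nolargefiles' will fail if the filesystem already contains files over 2GB.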
A. Daniel King_1
Super Advisor

Re: Artificial limit of file sizes by row count?

2GB would be quite a few more than 100,000 lines!

I could probably use an average line length to calculate a byte-level limitation...
Command-Line Junkie
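That back-of-the-envelope calculation could be sketched like this -- the sample file name is a placeholder; any representative text file would do:

```shell
# Measure the average bytes per line (newline included) of a sample
# file, then scale up to a 100,000-line byte ceiling.
SAMPLE=/etc/hosts    # placeholder: use a file typical of your data

AVG=`awk '{ sum += length($0) + 1 } END { if (NR) print int(sum / NR) }' "$SAMPLE"`
echo "average line length: $AVG bytes"
echo "100000-line ceiling: `expr $AVG \* 100000` bytes"
```

With an 80-byte average line, for example, 100,000 lines works out to roughly 8MB -- nowhere near the 2GB filesystem limit.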
Darren Prior
Honored Contributor

Re: Artificial limit of file sizes by row count?

Hiya,

If it's the 2gig limit you're most worried about then a possible idea is to run a find from cron that checks for files of that size within specific directories. The downside is that it will be reactive rather than proactive, plus it's not efficient to run find constantly.

Another alternative is to have a separate filesystem that will accept largefiles, write the files in there, then copy/move the files to the correct place IF they are below 2gig.

regards,

Darren.
Calm down. It's only ones and zeros...
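A cron-driven size sweep along the lines Darren describes could be as simple as the following sketch -- the directory and limit are placeholders to adjust for your filesystem:

```shell
# Report files larger than LIMIT bytes under DIR.  Both are
# placeholders; in practice cron would run this against the
# filesystem of interest with LIMIT just under 2GB.
DIR=${DIR:-.}
LIMIT=${LIMIT:-2147483647}

# -size +Nc selects files strictly larger than N bytes (characters)
find "$DIR" -type f -size +"$LIMIT"c -print
```

Scheduling it hourly from cron keeps the overhead down compared to running find constantly.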
Pete Randall
Outstanding Contributor

Re: Artificial limit of file sizes by row count?

As far as your alarm is concerned, write a cron script to use find with wc to identify the culprits so you can take remedial action. Something like:

find /home -type f |
while read FILE
do
    if [ `wc -l < "$FILE"` -gt 100000 ]
    then
        echo "naughty file: $FILE"
    fi
done

In case you hadn't guessed, scripting is not my strong suit, but this oughtta do something for you.

Pete