
File Monitoring

 
Benjamin Cheong
Occasional Contributor

File Monitoring

Hi there,

Can anyone recommend a good way to have a perl or shell script that constantly monitors for the creation of new files and launches another process?

I'm thinking of using perl or shell, so that I can easily modify the code without recompiling if I need to add more files to be monitored.

Thanks!

Cheers,
Ben
6 REPLIES
Mark Grant
Honored Contributor

Re: File Monitoring

Unless you have a smallish set of files to look for, this will be quite a resource hog.

Anyway, if you have a list of a few files you want to check for their creation, it's quite simple.

while true
do
    for file in `cat list_of_files`
    do
        [ -r "$file" ] && {
            # do stuff because this file exists
            echo "$file exists"
        }
    done
    sleep 60
done

list_of_files contains the full path name of files to check.

I can't help thinking that you are trying to achieve something that could probably be done a better way. Could you give a little more detail as to what you want.
Never precede any demonstration with anything more predictive than "watch this"
Benjamin Cheong
Occasional Contributor

Re: File Monitoring

Hi Mark,

Many thanks for your response. I am actually looking at creating a process that monitors for the creation of a set of files.

Once any one of these files is created, I need to kick off another process to start processing the file.

Hope that gives a clear enough picture of what I want to do.

Cheers,
Ben
Mark Grant
Honored Contributor

Re: File Monitoring

I'm just thinking out loud here, but in cases like this I always spend far too much time trying to find a way to get the thing that creates the file to tell me about it.

For example, assuming these files arrive via ftp, I'd rather get the initiating machine to ftp the file and then rexec/ssh/mail the receiving machine to kick off the processing.

If the file names are known in advance, it might even be possible to have the names created as named pipes and then have something waiting on these files. This would be the best solution if ftp doesn't walk all over files that exist. Never tried it myself.
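To make the named-pipe idea concrete, here's a minimal sketch (not tested on HP-UX 10.20, and the path is made up). The point is that a read on a FIFO blocks without burning CPU until a writer shows up; the background echo below just simulates the sender:

```shell
#!/bin/sh
# Create the well-known file name as a FIFO (named pipe).
PIPE=/tmp/incoming_$$
mkfifo "$PIPE"

# Simulated producer: in real life this would be the ftp transfer.
( sleep 1; echo "payload" > "$PIPE" ) &

# Consumer: this read blocks, using no CPU, until data arrives.
read line < "$PIPE"
echo "data arrived: $line"    # replace with the real processing step

rm -f "$PIPE"
```

Wrap the read in a `while true` loop to keep waiting for the next arrival. Whether ftp will write into an existing FIFO rather than replacing it is exactly the caveat mentioned above.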

However, if we have no choice, the usual approach is to make sure all the files arrive in one directory and regularly scan it for new files with "ls". It's a bit dull, but it's simple.
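One pass of that scan could look like the sketch below (directory and file names are just for the demo): keep a snapshot of the last "ls" and diff it against a fresh one, and anything only in the new listing is a new arrival.

```shell
#!/bin/sh
# Demo directory standing in for the real incoming directory.
DIR=/tmp/incoming_dir_$$
mkdir -p "$DIR"
touch "$DIR/old.dat"

ls "$DIR" > /tmp/seen_$$          # snapshot of known files

touch "$DIR/new.dat"              # ...later, a new file arrives

ls "$DIR" > /tmp/seen_$$.new
# comm -13 prints lines only in the second (sorted) file,
# i.e. the files that weren't there last time. ls output is
# already sorted, which is what comm requires.
for f in `comm -13 /tmp/seen_$$ /tmp/seen_$$.new`
do
    echo "new file: $DIR/$f"      # kick off the processing here
done
mv /tmp/seen_$$.new /tmp/seen_$$

rm -rf "$DIR" /tmp/seen_$$
```

Wrapped in `while true; do ...; sleep 60; done`, this becomes the same monitor loop as the earlier post, but it catches files you didn't know the names of in advance.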
Never precede any demonstration with anything more predictive than "watch this"
V. Nyga
Honored Contributor

Re: File Monitoring

Hi Ben,

why don't you use cron jobs in SAM?

SAM area - Process management - Scheduled cron jobs

You can use shell - for example delete core files:
find . -type f -name core -exec rm {} \;

Instead of rm you can run anything behind -exec.
In SAM you configure how often this cron job should run.
It works for simple monitoring, and you can also edit the cron jobs file manually to add more files.
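For reference, the crontab line for something like this could look as follows - the schedule and script path are only examples, and on older SysV crons like HP-UX's it's safest to spell the minutes out as a list rather than use step syntax:

```
# min          hour day month weekday  command
0,15,30,45     *    *   *     *        /home/ben/check_files.sh
```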

Regards
Volkmar
*** Say 'Thanks' with Kudos ***
V. Nyga
Honored Contributor

Re: File Monitoring

Hi again,

for the syntax, check /var/spool/cron/crontabs/root (for me under HP-UX 10.20)

To modify cron jobs:
/var/spool/cron/crontabs/root

Regards
Volkmar
*** Say 'Thanks' with Kudos ***
Benjamin Cheong
Occasional Contributor

Re: File Monitoring

Hi,

Thanks for the replies. I thought about using cron, but I actually need to be able to see the process running when I run top or ps -ef (which is why I thought about perhaps using perl).

I've tried the shell script, but I don't actually see the script that I launched - I just see the sleep. Is there any way to have the name of the script show up in top or ps?

Cheers,
Ben