Operating System - Linux

Intermittent FTP Client Process Hang

 
SOLVED
Jared Middleton
Frequent Advisor

Intermittent FTP Client Process Hang

I have an FTP process (just a few lines of shell script) to transfer a small (< 1MB) output file from my RHEL 4 server (USA) to a remote Windows server (Singapore). It works 99.x% of the time, but intermittently hangs.

There are known occasional network 'issues' at the remote site, and these are being worked on. *I* am not trying to fix any underlying network connectivity issues per se. It's OK for this transfer to fail (every now and then). Cron will run the process to send the most recent copy of the output file every 15 minutes. These file updates are not required to be real-time.

Here's the problem...

BAD: When a glitch occurs, my Linux FTP client process/script hangs... forever. The session does not time out or return an error; it just sits there... like a blonde holding an abacus.

WORSE: When cron re-runs the process every 15 minutes (after a hang), the subsequent FTP sessions appear to run as if the transfer were successful - the FTP session log looks correct; however, end users report the file is not updated. Maybe the destination file remains locked or something? So far, the solution has been to manually kill the hung FTP process, after which everything resumes fine (every 15 mins)... until the next glitch occurs. I've grown tired of this silliness, but I don't want to automate killing processes if I can avoid it, so...

QUESTION: Anyone know how to make the Linux FTP client actually time-out or exit properly?
2 REPLIES
Stuart Browne
Honored Contributor
Solution

Re: Intermittent FTP Client Process Hang

There are a few things I can think of.

The first, and by far the easiest, is to use a different FTP client that supports timeout settings. For instance, 'ncftp' supports three timeout settings: connect-timeout, control-timeout and xfer-timeout. You can set the client to exit if it doesn't get data within a fixed period.
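
A minimal sketch of that approach, for what it's worth. The hostname, credentials and paths below are placeholders, and you should check 'man ncftpput' on your version for the exact option names:

# Sketch only: remote.example.com, ftpuser, the password and the file
# paths are all made-up placeholders.  -t (if your ncftpput has it)
# aborts the run after the given number of seconds.
ncftpput -t 120 -u ftpuser -p 'secret' remote.example.com /incoming /data/output.csv
if [ $? -ne 0 ]; then
    logger -t ftp-push "ncftpput failed or timed out"
fi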

The second is a bit more complicated and requires the use of 'expect' to drive your FTP session. Expect can be given a timeout for each pattern it waits on, and when that timeout expires it can close the filehandle and kill the spawned process (i.e. the ftp client).
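
A rough sketch of the expect route, driving the stock ftp client from a shell script via a here-document. The host, login, password and filenames are placeholders - adjust the prompts and commands to match your own session:

expect <<'EOF'
# Any pattern not seen within 60 seconds makes expect give up,
# which also tears down the spawned ftp client.
set timeout 60
spawn ftp remote.example.com
expect { timeout { exit 1 } "Name" }
send "ftpuser\r"
expect { timeout { exit 1 } "Password:" }
send "secret\r"
expect { timeout { exit 1 } "ftp>" }
send "put /data/output.csv output.csv\r"
expect { timeout { exit 1 } "ftp>" }
send "bye\r"
expect eof
EOF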

The third is a bit more difficult and requires a bit of funky shell-ness, but it requires no extra software to be installed. It does, however, depend on how automated your FTP command is.

The third will look something akin to this, assuming your FTP command is fully automated with '.netrc':

ftp remote.example.com &   # hostname is a placeholder; .netrc supplies the login and commands
MYPID=$!
sleep 120
ps -o pid= -p $MYPID > /dev/null && kill -TERM $MYPID

Short explanation of this:

- Throw the ftp into the background
- Record the PID
- Sleep (really, the timeout)
- Check whether the PID is still alive, and kill it if it is
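
One possible refinement of the same idea, if you don't want every run to take the full two minutes: run the sleep/kill as a background watchdog and wait on the ftp itself (again, the hostname is a placeholder and .netrc is assumed to drive the session):

ftp remote.example.com &
FTPPID=$!
( sleep 120; kill -TERM $FTPPID 2>/dev/null ) &   # watchdog: kill ftp if it outlives 120s
WATCHPID=$!
wait $FTPPID                                      # returns as soon as ftp finishes (or is killed)
kill $WATCHPID 2>/dev/null                        # transfer done in time; stop the watchdog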
One long-haired git at your service...
skt_skt
Honored Contributor

Re: Intermittent FTP Client Process Hang

The default timeout on the server side is 15 minutes, and for the client it is 2 hours (man ftpd).

Try pushing the files every 30 minutes (depending on how long the ftp process takes to finish the one which was already triggered).

I would say this has something to do with the timeout settings.
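
If the concern is new cron runs piling up behind a transfer that is still (or hung) running, a simple lock around the push script is another way to keep them from overlapping - a sketch only, with a made-up lock path:

LOCKDIR=/var/tmp/ftp-push.lock
if mkdir "$LOCKDIR" 2>/dev/null; then
    trap 'rmdir "$LOCKDIR"' EXIT      # release the lock when the script exits
    # ... run the ftp transfer here ...
else
    logger -t ftp-push "previous transfer still running; skipping this run"
fi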