Intermittent FTP Client Process Hang
Operating System - Linux
07-20-2007 07:05 PM
I have an FTP process (just a few lines of shell script) to transfer a small (< 1MB) output file from my RHEL 4 server (USA) to a remote Windows server (Singapore). It works 99.x% of the time, but intermittently hangs.
There are known occasional network 'issues' at the remote site, and these are being worked on. *I* am not trying to fix any underlying network connectivity issues per se. It's OK for this transfer to fail (every now and then). Cron will run the process to send the most recent copy of the output file every 15 minutes. These file updates are not required to be real-time.
Here's the problem...
BAD: When a glitch occurs, my Linux FTP client process/script hangs... forever. The session does not time out or return an error; it just sits there indefinitely.
WORSE: When cron re-runs the process every 15 minutes (after a hang), the subsequent FTP sessions appear to run as if the transfer is successful - the FTP session log looks correct; however, end users claim the file is not updated. Maybe the destination file remains locked or something? So far, the solution has been to manually kill the hung FTP process, after which everything resumes fine (every 15 mins)... until the next glitch occurs. I've now tired of this silliness, but I don't want to automate killing processes if I can avoid it, so...
QUESTION: Anyone know how to make the Linux FTP client actually time-out or exit properly?
2 REPLIES
07-20-2007 07:53 PM
Solution
There are a few things I can think of.
The first, and by far the easiest, is to use a different FTP client that supports timeout settings. For instance, 'ncftp' supports three timeouts: connect-timeout, control-timeout, and xfer-timeout. You can set the client to exit if it doesn't receive data within a fixed period.
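As a sketch of that first option: the batch-mode companion 'ncftpput' accepts a timeout on the command line, so a one-shot push could look like this (the host name, credentials, and paths below are purely illustrative, not from the original post):

```
# Hypothetical host/paths; -t gives ncftpput an overall timeout in seconds
ncftpput -t 120 -u myuser -p mypass ftp.example.com /inbound /tmp/report.out
```

Check the exit status in your script: a non-zero return (including a timeout) means the push should simply be retried on the next cron run.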
The second is a bit more complicated and requires 'expect' to control your FTP session. Expect can be given timeouts for when particular output is expected to appear, and can then close the filehandles and parent processes (i.e. the ftp client).
The third is a bit more difficult and requires some funky shell work, but it needs no extra software installed. It does, however, depend on how automated your FTP command is.
The third will look something akin to this, assuming your FTP command is fully automated with '.netrc':
ftp some.remote.host &                          # backgrounded; host name is illustrative
MYPID=$!                                        # PID of the ftp client
sleep 120                                       # grace period - this is the timeout
ps -o pid= -p "$MYPID" > /dev/null && kill -TERM "$MYPID"
Short explanation of this:
- Throw the ftp client into the background
- Record its PID
- Sleep (this is effectively the timeout)
- Check whether the PID is still alive, and kill it if so
One long-haired git at your service...
07-21-2007 10:00 AM
Re: Intermittent FTP Client Process Hang
The default timeout on the server side is 15 minutes, and for the client it is 2 hours (see man ftpd).
Try pushing the files every 30 minutes instead (depending on how long the ftp process takes to finish the one which was already triggered).
I would say it has something to do with the timeout settings.
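If you do try the longer interval, only the cron entry needs to change; a sketch (the script path and log file below are hypothetical):

```
# crontab entry: run the push script every 30 minutes (paths are illustrative)
*/30 * * * * /usr/local/bin/push_report.sh >> /var/log/push_report.log 2>&1
```

Note this spacing alone won't unstick an already-hung client; it only reduces the chance of a new run starting while the previous transfer is still in flight.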
The opinions expressed above are the personal opinions of the authors, not of Hewlett Packard Enterprise. By using this site, you accept the Terms of Use and Rules of Participation.
© Copyright 2025 Hewlett Packard Enterprise Development LP