Operating System - HP-UX
09-14-2007 08:31 AM
Getting errors from Cron jobs
We have an application running that has several monitor scripts in crontab. They have historically run fine, but as of yesterday we are getting messages like the ones below. Any ideas?
/export/home/prod//config.guess[34]: /dev/null: cannot create
(Unable to guess system type)
/export/home/prod/smc_smc/scripts/Monitor/CheckOutboundSmcInts.ss[1886351988]: cannot make pipe
qdump: Could not open file : File table overflow
/export/home/prod/smc_smc/scripts/Monitor/CheckOutboundSmcInts.ss[78]: cannot make pipe
*************************************************
Cron: The previous message is the standard output
and standard error of one of your crontab commands:
/export/home/prod/smc_smc/scripts/Monitor/CheckOutboundSmcInts.ss
3 REPLIES
09-14-2007 08:45 AM
Re: Getting errors from Cron jobs
It appears that you are hitting the nfile kernel limit. This may be the result of too many processes, or of processes that didn't terminate, or it may simply indicate heavy usage. Use sar -v 5 5 to see how near the file-sz limit you are.
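For reference, you can also query the current nfile setting directly; kmtune is the tool on 11.11 and earlier, kctune on 11i v2 and later (a rough sketch only; option letters and output vary a bit by release, so check the kmtune(1M)/kctune(1M) man pages):

# kmtune -q nfile       (11.11 and earlier: report the configured value of nfile)
# kctune nfile          (11.23 and later: the equivalent query with the newer tool)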
If it ain't broke, I can fix that.
09-14-2007 08:46 AM
Re: Getting errors from Cron jobs
"cannot create", "cannot make pipe", "could not open file", "file table overflow"
The errno man page would seem to indicate that you have reached the maximum number of open files configured in the kernel on the system in question.
The options are probably to shut down some processes or to update the kernel parameters to allow more open files.
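For what it's worth, the two errno values that usually sit behind these messages are worth telling apart (typical values from <sys/errno.h>; confirm on the system in question):

ENFILE (23)  "File table overflow"   - the system-wide open-file table, sized by nfile, is full
EMFILE (24)  "Too many open files"   - a single process hit its own limit (maxfiles / maxfiles_lim)

The messages quoted above look like the system-wide case, which is why nfile is the parameter to look at.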
09-14-2007 08:49 AM
Re: Getting errors from Cron jobs
The key error, I believe, is "File table overflow". This means that you have reached the maximum number of open files on your system.
You will need to check your 'nfile' kernel parameter usage.
Do a:
# sar -v 5 5
And note the values in the 'file-sz' column. The value to the left of the '/' is the current usage; the value to the right is the limit. If the two values are very close, then you are probably hitting the limit occasionally.
To fix this you will need to increase the value of nfile, regenerate the kernel, and reboot the system (11.11 and prior).
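If nfile does need to be raised on 11.11 or earlier, SAM (Kernel Configuration -> Configurable Parameters) is the usual route; a rough command-line sketch of the same procedure follows (the value 16384 is only an example, and the exact steps vary a little by release, so check kmtune(1M) and mk_kernel(1M) first):

# kmtune -s nfile=16384     (stage the new value in the kernel configuration)
# mk_kernel                 (build a new kernel with the updated configuration)
# kmupdate                  (install the new kernel at the next boot)
# shutdown -r now           (reboot so the new nfile value takes effect)

On 11.23 and later the parameter is managed with kctune instead, and may not need a full kernel rebuild; check that release's kernel tuning documentation.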