Operating System - HP-UX
04-02-2004 07:41 AM
file: table is full
I have performed the Q4 analysis of the dump; do you know how to check which process(es) held such a large number of open files?
Thanks in advance.
Simon
04-02-2004 08:15 AM
Re: file: table is full
Now that you have rebooted, it will be almost impossible to find the culprit. Normally you could check with sar -v, or with ps -ef | grep ora | wc -l, whether the system or Oracle is hitting the nfile limit.
An nfile of around 7500 is not very high. Below is a good thread; check Pete's and Bill's posts.
http://forums1.itrc.hp.com/service/forums/questionanswer.do?threadId=327287
Kind regards and a good weekend,
Robert-Jan
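To illustrate the check described above: the file-sz column of sar -v reports the system file table as "used/nfile". A small, hypothetical Python helper that parses such a data line and reports utilization (the sample line is illustrative, not from a real system):

```python
# Hypothetical helper: parse the file-sz column of an HP-UX `sar -v`
# data line and report file-table utilization. The sample line below
# is illustrative only, not from a real system.

def file_table_usage(sar_line):
    """Return (used, limit, percent) from a sar -v data line.

    Assumes the second-to-last whitespace-separated field is the
    file-sz value ("used/nfile"); the last field is its overflow count.
    """
    fields = sar_line.split()
    used, limit = (int(x) for x in fields[-2].split("/"))
    return used, limit, 100.0 * used / limit

sample = "15:27:47 N/A N/A 905/16020 0 5995/18468 0 16395/29539 0"
used, limit, pct = file_table_usage(sample)
print(f"file table: {used}/{limit} ({pct:.1f}% used)")
```

If the percentage is consistently near 100, the nfile kernel tunable is undersized for the workload; watching this over time (rather than after a reboot) is what catches the culprit.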
04-02-2004 08:32 AM
Re: file: table is full
Depending on your amount of memory, you can raise nfile very high.
Mine is at 100k at the moment on my main production box. Here is my sar -v output, even on a DEV box,
showing nproc=16020, ninode=18468, and nfile=29539:
# sar -v 1 10
HP-UX hostname B.11.11 U 9000/800    04/02/04

15:27:46  text-sz  ov   proc-sz    ov  inod-sz     ov  file-sz      ov
15:27:47  N/A      N/A  905/16020  0   5995/18468  0   16395/29539  0
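Reading the output above: each -sz column shows current/maximum table entries, and the ov column counts overflows since the counters were reset. A hypothetical sketch that flags any of the process, inode, or file tables approaching its limit (the field positions are assumed from the sample line):

```python
# Hypothetical: flag HP-UX kernel tables nearing their limits, given a
# sar -v data line. Field positions assumed from the sample output
# above (time, text-sz, its ov, then proc-sz, inod-sz, file-sz, each
# followed by an overflow counter).

def table_pressure(sar_line, threshold=0.9):
    """Return warnings for tables at or above `threshold` utilization."""
    fields = sar_line.split()
    tables = {
        "proc": fields[3],   # proc-sz: used/nproc
        "inode": fields[5],  # inod-sz: used/ninode
        "file": fields[7],   # file-sz: used/nfile
    }
    warnings = []
    for name, ratio in tables.items():
        used, limit = (int(x) for x in ratio.split("/"))
        if used / limit >= threshold:
            warnings.append(f"{name} table at {used}/{limit}")
    return warnings

line = "15:27:47 N/A N/A 905/16020 0 5995/18468 0 16395/29539 0"
print(table_pressure(line) or "all tables below 90% of their limits")
```

In the sample above the file table sits at roughly 55% of nfile, so none of the tables would be flagged; a "file: table is full" error corresponds to the file table hitting 100% and the ov counter incrementing.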
04-02-2004 09:13 AM
Re: file: table is full
You could maybe get a report from historical performance data, if you collect it.