Out of memory?
11-21-2007 11:46 PM
I was wondering if someone can help me. I'm working on an HP-UX 11.23 PA-RISC server (64 x 1 GHz CPUs, 164 GB memory) and a customer has come to me with the following fault: "An error is being thrown from the Perl script. I think the Perl process is unable to hold 3M customers in the Perl hash and is quitting with 'out of memory'."
Is there any kernel parameter or similar that would limit the amount of memory the Perl process can have? Any help or ideas greatly appreciated.
I looked at ulimit for the user and memory was set to unlimited, and according to glance the server's physical memory wasn't a problem. I couldn't think of anywhere else to look.
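For reference, the check was along these lines, run as the account that owns the Perl process ('appuser' here is just a placeholder for that account):

  # su - appuser
  $ ulimit -a

and the memory limit came back as unlimited.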
Many thanks, Steve.
11-21-2007 11:59 PM
Re: Out of memory?
maxssiz_64bit
maxtsiz_64bit
maxdsiz_64bit
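(On 11.23 these can usually be inspected with kctune, one tunable per query, for example:

  # kctune maxdsiz_64bit

The exact tool depends on the release; older systems use kmtune or SAM instead.)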
11-22-2007 12:25 AM
Re: Out of memory?
The parameters you mention seem to make sense...
maxssiz_64bit specifies the maximum stack segment size, in bytes, for an executing process.
maxtsiz_64bit specifies the maximum shared-text segment size, in bytes.
maxdsiz_64bit specifies the maximum data segment size, in bytes, for an executing process.
These are currently set on this system to:
maxdsiz_64bit 2147483647 (2 GB, i.e. 2^31 - 1)
maxssiz_64bit 1073741824 (1 GB)
maxtsiz_64bit 1073741824 (1 GB)
They are all dynamic, and I guess they would all need increasing at the same time?
Cheers, Steve.
11-22-2007 04:22 AM
Solution
>maxdsiz_64bit 2147483647 (2 GB)
>I guess they would all need increasing at the same time?
Only maxdsiz_64bit needs to be increased.
Also, is your Perl a 64-bit application? If not, you also need to increase maxdsiz.
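For example, to raise it with kctune (the 8 GB value here is only illustrative; size it to your data):

  # kctune maxdsiz_64bit=8589934592

Since you mention the tunables are dynamic, the change should take effect without a reboot.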
11-22-2007 04:36 AM
Re: Out of memory?
Check for: use64bitall
Is it defined or undef?
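A quick way to check from the shell (perl's -V:name form prints a single build configuration variable):

  $ perl -V:use64bitall
  use64bitall='undef';

Output like this example indicates the build is not a full 64-bit one.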
Also, dealing with 3M records may warrant rethinking the algorithm used and its details, such as hash key size and the amount of data saved per key. How big are the keys and values on average? Round up to, say, a 32-byte boundary (just guessing) and multiply by 3,000,000: at 32 bytes per entry that is already ~96 MB, before Perl's own per-entry overhead. How much data are we talking about?
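One way to answer that empirically, assuming the CPAN module Devel::Size can be installed on the box (it is not part of core Perl), is to size a representative slice and extrapolate:

use strict;
use warnings;
use Devel::Size qw(total_size);   # CPAN module, not core Perl

my %sample;
# ... load a representative slice here, e.g. the first 10,000
# customer records, using the same keys/values as the real script ...

# total_size() walks the structure, counting keys, values and
# Perl's own per-entry overhead.
my $bytes = total_size(\%sample);
printf "10,000 entries use %d bytes => roughly %.2f GB for 3M\n",
       $bytes, $bytes * 300 / 2**30;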
Can you alter the source to pause every 500,000 records loaded and check the memory used by the process?
You may want to use a work file, or a hash tied to a file, or make the hash hold lseek offsets back into the original data instead of the records themselves.
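A minimal sketch of the offset idea, combined with the 500,000-record checkpoint suggested above (the file name, record layout and key field are all made up for illustration, not taken from the customer's script):

#!/usr/bin/perl
use strict;
use warnings;

# Keep only byte offsets in the hash; the records themselves stay
# on disk, so the 3M-entry hash holds one integer per customer.
my %offset_for;
my $count = 0;

open my $fh, '<', 'customers.dat' or die "open: $!";   # name made up
while (1) {
    my $pos  = tell $fh;             # where the next record starts
    my $line = <$fh>;
    last unless defined $line;
    my ($key) = split /\|/, $line;   # assume key = first '|' field
    $offset_for{$key} = $pos;

    # Checkpoint every 500,000 records: report the process size
    # (UNIX95 enables the XPG4 -o option on HP-UX ps).
    if (++$count % 500_000 == 0) {
        chomp(my $vsz = `UNIX95=1 ps -o vsz= -p $$`);
        warn "$count records loaded, vsz=$vsz\n";
    }
}

# Fetch a record later by seeking back to its saved offset.
sub fetch_record {
    my ($key) = @_;
    return unless exists $offset_for{$key};
    seek $fh, $offset_for{$key}, 0 or die "seek: $!";
    return scalar <$fh>;
}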
Hope this helps some,
Hein van den Heuvel (at gmail dot com)
HvdH Performance Consulting
11-22-2007 12:18 PM
Re: Out of memory?
Bill Hassell, sysadmin
11-22-2007 09:39 PM
Re: Out of memory?
Thanks for your replies.
I checked 'perl -V' and use64bitall is undef, which I assume means that maxdsiz is the key kernel parameter now. This is currently set to 1073741824 (1 GB, if my maths is correct!).
Thanks for all the Perl-specific advice. Unfortunately I have no experience with Perl at all, so I will pass this on to the customer who reported the issue and see if it helps.
Many thanks again. Steve.
11-22-2007 09:45 PM
Re: Out of memory?
Yes, you can increase it to 4 GB, but you probably can't use more than 3 GB, and only if you use chatr +q3p enable +q4p enable.
To get the last 1 GB you must link perl with -N.
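For the chatr part, that would be run against the perl executable itself, for example (path illustrative; check where your perl lives with 'which perl'):

  # chatr +q3p enable +q4p enable /opt/perl5/bin/perl

chatr changes how the 32-bit process address space quadrants are used, making the third (and with +q4p most of the fourth) quadrant available for private data.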