04-06-2007 03:04 AM
need a performance/memory guru for problem
I currently have a perl process that gets an "out of memory" error after processing some large files. The heap seems to grow to about 4 GB and then the process dies. I ran ulimit and it shows:
$ ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 8192
coredump(blocks) unlimited
nofiles(descriptors) 5763
vmemory(kbytes) unlimited
However, when I run getrlimit I get:
Maximum data segment size 2147483647
Maximum file size 2147483647
Maximum number of files 1024
8192
Maximum stack size 8388608
Does anyone know why getrlimit shows different values from ulimit, and whether "unlimited" in ulimit is really accurate? (Do I need to hard-code a real value, like 5 GB?)
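For reference, ulimit reports the stack in kilobytes while getrlimit() reports bytes (the 8192 KB stack is the same 8388608 bytes), and 2147483647 is 2^31 - 1, i.e. about a 2 GB ceiling. A minimal way to see what the perl process itself gets back from getrlimit() - a sketch only, assuming the BSD::Resource module from CPAN is installed (it is not part of core perl) - would be:

#!/usr/bin/perl
# Print the data and stack limits the current perl process actually sees.
# BSD::Resource (CPAN) exports getrlimit(), the RLIMIT_* constants and RLIM_INFINITY.
use strict;
use warnings;
use BSD::Resource;

for my $r ([ 'data',  RLIMIT_DATA  ],
           [ 'stack', RLIMIT_STACK ]) {
    my ($name, $resource) = @$r;
    my ($soft, $hard) = getrlimit($resource);
    printf "%-6s soft=%s hard=%s\n", $name,
        ($soft == RLIM_INFINITY ? 'unlimited' : $soft),
        ($hard == RLIM_INFINITY ? 'unlimited' : $hard);
}

Whatever that prints for the soft data limit is presumably the number that matters to the perl process, regardless of what the shell's ulimit claims.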
04-06-2007 03:14 AM
Re: need a performance/memory guru for problem
There is no such thing as unlimited.
What the ulimit settings are saying is that no limit is being imposed on the program below whatever the system itself sets.
You may wish to look at the kmtune/kctune settings to see if a limit is being imposed there:
maxdsiz, maxssiz, shmmax, etc.
swapinfo -tam
You won't succeed in coding a 5 GB memory segment unless there is actually that much memory in the box. If you exceed physical memory, you will page/swap the system to a near halt.
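If you want to see empirically where the effective ceiling sits, a throwaway probe can help. A rough sketch (the 64 MB chunk size is arbitrary, and since perl's "Out of memory!" error is generally not trappable, the last number printed is your estimate):

#!/usr/bin/perl
# Deliberately grab memory in fixed-size chunks until the process dies;
# the last total printed approximates the effective data-segment ceiling.
use strict;
use warnings;
$| = 1;                     # flush each line before the process dies
my $chunk_mb = 64;          # arbitrary step size
my @blocks;
my $total = 0;
while (1) {
    push @blocks, 'x' x ($chunk_mb * 1024 * 1024);
    $total += $chunk_mb;
    print "allocated ${total} MB so far\n";
}

Run that under the same shell and limits as the real job and compare the point of death with your maxdsiz setting.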
You say:
however when I run getrlimit I get:
Maximum data segment size 2147483647
Maximum file size 2147483647
Maximum number of files 1024
8192
Maximum stack size 8388608
Those figures seem to correspond to kernel values.
SEP
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
04-06-2007 03:17 AM
Re: need a performance/memory guru for problem
Your 'maxdsiz' kernel parameter is limiting the size of your data. Increase 'maxdsiz' (if this is a 32-bit process) or 'maxdsiz_64bit' (if this is a 64-bit process).
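Before rebuilding the kernel it may also be worth checking whether the soft limit is simply sitting below the hard limit. A small sketch (again assuming CPAN's BSD::Resource) that tries to raise the soft data limit up to the hard one; if even the hard limit is around 2 GB, only a larger maxdsiz/maxdsiz_64bit (and a 64-bit perl for anything past 4 GB) will help:

#!/usr/bin/perl
# Try to raise the soft data-segment limit up to the hard limit.
# Requires BSD::Resource from CPAN; run this before the real work starts.
use strict;
use warnings;
use BSD::Resource;

my ($soft, $hard) = getrlimit(RLIMIT_DATA);
print "before: soft=$soft hard=$hard\n";
setrlimit(RLIMIT_DATA, $hard, $hard)
    or die "setrlimit(RLIMIT_DATA) failed: $!\n";
($soft, $hard) = getrlimit(RLIMIT_DATA);
print "after:  soft=$soft hard=$hard\n";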
Regards!
...JRF...
04-06-2007 03:45 AM
Re: need a performance/memory guru for problem
>> I currently have a perl process that gets an "out of memory" error after processing some large files.
Are you sure you even get to 4 GB, and not 2 GB?
Is this a 64-bit environment (Itanium, perhaps)?
Is the perl build 64-bit ready?
uname -a ?
perl -v ?
Anyway, I would urge you to assume resources are limited at some point and focus on the perl application code.
Can it be optimized?
Should it be perl?
Do you need an on-disk array instead of an in-memory one? Could you pre-sort (one of the larger) data sets so that at least that one does not need to be kept in memory (see the sketch below)?
Can you densify the portion of the data that is kept in memory?
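For example, a rough sketch with made-up file names and a made-up key<TAB>value layout: sort the biggest file on disk first, then stream it one line at a time and keep only the small side in memory:

#!/usr/bin/perl
# Sketch only: big.dat, small.dat and the key<TAB>value layout are
# illustrative. Pre-sort the big file on disk, then stream it line by
# line so it is never held in memory as a whole.
use strict;
use warnings;

system('sort', '-t', "\t", '-k1,1', '-o', 'big.sorted', 'big.dat') == 0
    or die "sort failed: $?\n";

my %small;                              # only the small data set in memory
open my $in, '<', 'small.dat' or die "small.dat: $!\n";
while (<$in>) {
    chomp;
    my ($k, $v) = split /\t/, $_, 2;
    $small{$k} = $v;
}
close $in;

open my $big, '<', 'big.sorted' or die "big.sorted: $!\n";
while (my $line = <$big>) {
    chomp $line;
    my ($key, $rest) = split /\t/, $line, 2;
    next unless exists $small{$key};
    # ... work on the matching record here, one line at a time ...
}
close $big;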
Hope this helps some,
Hein van den Heuvel (at gmail dot com)
HvdH Performance Consulting