Operating System - HP-UX

Michael Murphy_2
Frequent Advisor

need a performance/memory guru for problem

Folks,

I currently have a perl process that gets an "out of memory" error after processing some large files. The heap seems to grow to 4 GB and then the process dies. I ran ulimit, which shows:

$ ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 8192
coredump(blocks) unlimited
nofiles(descriptors) 5763
vmemory(kbytes) unlimited

However, when I call getrlimit I get:

Maximum data segment size 2147483647
Maximum file size 2147483647
Maximum number of files 1024
8192
Maximum stack size 8388608

Does anyone know why getrlimit reports different values from ulimit, and whether "unlimited" in ulimit is really accurate? (Do I need to hard-code a real value, like 5 GB?)
4 REPLIES
Sandman!
Honored Contributor

Re: need a performance/memory guru for problem

The "unlimited" value for the data segment size is the same as reported by getrlimit. man getrlimit(2) for details. Look at the "/usr/include/sys/resource.h" for definition of RLIM_INFINITY.
Steven E. Protter
Exalted Contributor

Re: need a performance/memory guru for problem

Shalom,

There is no such thing as unlimited.

What the ulimit settings are saying is that no limit is being imposed on your programs beyond what the system itself sets.

You may wish to look at the kmtune/kctune settings to see if a limit is being imposed there:
maxdsiz, maxssiz, shmmax, etc.

swapinfo -tam

You won't succeed in coding a 5 GB memory segment unless there is actually that much memory in the box. If you exceed physical memory, you will page the system to a near halt.
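
If it helps, here is a small sketch that gathers those figures in one pass by shelling out to the commands named above (kctune on 11i v2 and v3, kmtune on older releases). The exact option syntax may vary with your HP-UX release, so treat it as a starting point.

#!/usr/bin/perl
# Dump the kernel tunables and the swap picture that actually cap a process.
use strict;
use warnings;

for my $param (qw(maxdsiz maxdsiz_64bit maxssiz shmmax)) {
    # try kctune first (11i v2/v3), fall back to kmtune on older releases
    my $out = `kctune $param 2>/dev/null`;
    $out = `kmtune -q $param 2>/dev/null` unless $out;
    print $out if $out;
}

# swapinfo lives in /usr/sbin and may need root
my $swap = `swapinfo -tam`;
print $swap;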


You say:

however when I run getrlimit I get:

Maximum data segment size 2147483647
Maximum file size 2147483647
Maximum number of files 1024
8192
Maximum stack size 8388608

Those figures seem to correspond to kernel values.

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
James R. Ferguson
Acclaimed Contributor

Re: need a performance/memory guru for problem

Hi:

Your 'maxdsiz' kernel parameter is limiting the size of your data. Increase 'maxdsiz' (if this is a 32-bit process) or 'maxdsiz_64bit' (if this is a 64-bit process).
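
As a quick way to tell which of the two tunables applies, you can ask perl itself whether it was built as a 32-bit or a 64-bit executable; a rough sketch using the standard Config module:

#!/usr/bin/perl
# Report the pointer size of this perl build: 4 bytes means a 32-bit
# process (governed by maxdsiz), 8 bytes means 64-bit (maxdsiz_64bit).
use strict;
use warnings;
use Config;

print "archname : $Config{archname}\n";
print "ptrsize  : $Config{ptrsize} bytes (", $Config{ptrsize} * 8, "-bit pointers)\n";

Bear in mind that a ptrsize of 4 means a 32-bit address space, so the process can never grow past 4 GB no matter how the tunables are set.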

Regards!

...JRF...
Hein van den Heuvel
Honored Contributor

Re: need a performance/memory guru for problem

>> Folks,
>> I currently have a perl process that gets an "out of memory" error after processing some large files.

Are you sure you even get to 4 GB, and not 2 GB?
Is this a 64-bit environment (Itanium, perhaps)?
Is the perl version 64-bit ready?
uname -a?
perl -v?

Anyway, I would urge you to assume resources are limited at some point and focus on the perl application code.
Can it be optimized?
Should it be perl at all?
Do you need an on-disk array instead of an in-memory one? (See the sketch below.)
Do you need to pre-sort one of the larger data sets so that at least that one does not have to be kept in memory?
Can you densify the portion of the data that has to stay in memory?
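
To make the on-disk idea concrete, here is a rough sketch that streams the input one line at a time and keeps the big lookup structure in a tied DB_File hash on disk instead of in the heap. DB_File ships with perl but needs the Berkeley DB library, and the file and field names here are invented for illustration.

#!/usr/bin/perl
# Stream a large input file and store the lookup table on disk.
use strict;
use warnings;
use Fcntl;      # for O_CREAT and O_RDWR
use DB_File;

my %table;
tie %table, 'DB_File', '/var/tmp/table.db', O_CREAT | O_RDWR, 0644, $DB_HASH
    or die "cannot tie /var/tmp/table.db: $!";

open my $in, '<', 'bigfile.dat' or die "bigfile.dat: $!";
while (my $line = <$in>) {            # only one record in memory at a time
    chomp $line;
    my ($key, $val) = split /\t/, $line, 2;
    $table{$key} = $val;              # written to disk, not to the heap
}
close $in;
untie %table;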

Hope this helps some,
Hein van den Heuvel (at gmail dot com)
HvdH Performance Consulting