
Out of memory error

 
SOLVED
Ninad_1
Honored Contributor

Out of memory error

Hi,

While running a Perl script we are getting an "Out of memory!" error.
The script reads a file and populates a hash table. The file size is around 250 MB. When I monitor the process through glance, I can see that it runs until it is using around 623 MB and then exits with the above-mentioned error.
I have read through previous posts about similar errors, where the suggestions are to increase maxdsiz and/or maxdsiz_64.
Following are the maxdsiz, maxssiz and maxtsiz parameters for my server:
maxdsiz 393216 pages
maxdsiz_64bit 327680 pages
maxssiz 98048 pages
maxssiz_64bit 98048 pages
maxtsiz 16384 pages
maxtsiz_64bit 262144 pages
(pagesize=4096 bytes)

ulimit -a gives
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1572864
stack(kbytes) 392192
memory(kbytes) unlimited
coredump(blocks) 4194303

As per the parameters, maxdsiz = 1536 MB and maxssiz = 383 MB. (Hence, as per some posts, since the programs operate in the 1st quadrant or something like that, the effective memory left for the data area is 1536 - 383 = 1153 MB.)
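To double-check the page-to-MB conversion, here is the arithmetic in POSIX shell (4096-byte pages):

echo $((393216 * 4096 / 1048576))   # maxdsiz -> 1536 MB
echo $((98048 * 4096 / 1048576))    # maxssiz -> 383 MB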

Now my question is: why does the program exit at around 623 MB with an out of memory error when ideally it could use up to 1153 MB?

Also, please suggest what the values of these parameters should be.
I am attaching the output of sysdef for the other kernel parameters.

Your help is much appreciated.

Thanks,
Ninad
16 REPLIES
Steven E. Protter
Exalted Contributor

Re: Out of memory error

Shalom Ninad,

Perhaps it would be possible to run this program under the tusc utility. Then you can get a look at what the system says is happening right when it terminates.
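Something along these lines should do it (the exact flags may vary by tusc version, and yourscript.pl here just stands in for your script):

tusc -f -o /tmp/perl.tusc perl yourscript.pl   # -f follows child processes, -o writes the trace to a file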

If there is a kernel-related issue, it probably has to do with memory regions, but I can't work the math out to figure out why it's failing when it fails.

SEP
Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
Eric Antunes
Honored Contributor

Re: Out of memory error

Hi Ninad,

First of all, in HP-UX 11.0 those parameters are in bytes, not pages. I don't know which OS version you are on, but look at the description of each of them in SAM for a better understanding.

Here are my values as an example:

maxdsiz          536870912    536870912    Static
maxdsiz_64bit    1073741824   1073741824   Static
maxfiles         1024         1024         Static
maxfiles_lim     1024         1024         Static
maxssiz          134217728    134217728    Static
maxssiz_64bit    1073741824   1073741824   Static
maxswapchunks    16384        16384        Static
maxtsiz          134217728    134217728    Static
maxtsiz_64bit

Best Regards,

Eric Antunes
Each and every day is a good day to learn.
Ninad_1
Honored Contributor

Re: Out of memory error

SEP,

I do not have the tusc utility installed on my server. Can you suggest any other method? How can I sort out this problem? Please help.
Also, can you let me know if my understanding of the parameters is correct, especially the 1st quadrant and maxdsiz - maxssiz bit?

Eric,

You are absolutely right. The maxdsiz and related parameters shown in SAM are in bytes, but the values I provided in the post are from the output of the sysdef command, which reports figures in pages.

Thanks,
Ninad
Arunvijai_4
Honored Contributor

Re: Out of memory error

Hi Ninad,

You can get tusc from the following location,
http://hpux.connect.org.uk/hppd/hpux/Sysadmin/tusc-7.8/

-Arun
"A ship in the harbor is safe, but that is not what ships are built for"
Ninad_1
Honored Contributor

Re: Out of memory error

Arun,

Thanks for the link. Coincidentally, I was just having a look at that link after searching on Google. But a binary depot is only available for 11i onwards, whereas my server is running 11.00. Can you suggest any other alternative? Do any patches need to be installed?

Thanks,
Ninad
Steven E. Protter
Exalted Contributor

Re: Out of memory error

Shalom again Ninad,

I'd suggest downloading and compiling the perl source code from http://www.perl.org or http://www.cpan.org

Also, if you contact your HP rep or open a support call, I'd say HP has those Perl binaries for HP-UX 11.00 lying around.

I'm nearly certain we have them, but it's probably a violation of our support agreement to distribute these files.

The tarball here might work:
http://hpux.connect.org.uk/hppd/hpux/Languages/perl-5.8.7/

SEP
Arunvijai_4
Honored Contributor

Re: Out of memory error

Hi Ninad,

You can install the 11.11 tusc depot on 11.0; it will work well. Also, have a look at this page to get to know tusc:

http://h21007.www2.hp.com/dspp/tech/tech_TechDocumentDetailPage_IDX/1,1701,2894,00.html
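Installation is the usual swinstall from the downloaded depot; roughly like this (the depot filename here is only an example):

swinstall -s /tmp/tusc-7.8-11.11.depot tusc   # -s needs the absolute path to the depot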

-Arun
"A ship in the harbor is safe, but that is not what ships are built for"
Ninad_1
Honored Contributor

Re: Out of memory error

Hi,

I managed to get an 11.00 depot for perl 5.6.1, which I installed, and I ran the script again, but I am still getting the error. Only now the error is more elaborate; I hope it makes sense to you guys.
Please can you advise on how I can proceed now?

The error is "Out of memory during request for 212 bytes, total sbrk() is 668161892 bytes!"
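That sbrk() total works out to roughly 637 MB:

echo $((668161892 / 1048576))   # -> 637 MB at the point of failure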

Regs,
Ninad
A. Clay Stephenson
Acclaimed Contributor

Re: Out of memory error

Unless your Perl executable was compiled and linked with special options, the maximum data segment size for a 32-bit process is 1 GB even if maxdsiz is larger. Because both the data and stack segments are allocated from the same quadrant, your actual solution is to reduce maxssiz to something like 64 MB (and that is generous). Your ~383 MB maxssiz reduces the memory available for dynamic memory allocation even if the stack never actually comes close to that amount.
If it ain't broke, I can fix that.
Ninad_1
Honored Contributor

Re: Out of memory error

Clay,

I did a small experiment to understand the parameters and the problem.
The kern params are
maxdsiz 1610612736 bytes / 1572864 kbytes
maxssiz 401604608 bytes / 392192 kbytes

ulimit -a gives
data 1572864 kbytes
stack 392192 kbytes

Then I ran the perl script for combinations of data and stack sizes, using ulimit -d and ulimit -s to set the limits.
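For each run I set the limits in the current shell before invoking perl, along these lines (ulimit -d and -s take kbytes; the script name is just illustrative):

ulimit -d 1048576   # data segment limit in kbytes (here 1 GB)
ulimit -s 65536     # stack limit in kbytes (here 64 MB)
perl loadfile.pl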
Following are the results

-----------------------------------------
| data(bytes)|stack(bytes)| error(bytes)|
-----------------------------------------
| 1610612736 | 401604608 | 668161892 |
-----------------------------------------
| 1073741824 | 401604608 | 668161892 |
-----------------------------------------
| 524288000 | 401604608 | 512081764 |
-----------------------------------------
| 524288000 | 104857600 | 512081764 |
-----------------------------------------
| 209715200 | 104857600 | 208076644 |
-----------------------------------------
| 104857600 | 104857600 | 104165220 |
-----------------------------------------
| 52428800 | 104857600 | 52201316 |
-----------------------------------------
| 1610612736 | 67108864 | 668161892 |
-----------------------------------------

From this I am unable to reconcile what I have read in the earlier posts and what you have stated regarding the use of the 1st quadrant, and the actual usable space for data = maxdsiz - maxssiz.

If you look at the figures, the error comes at approximately the same number of bytes as the data segment size set by ulimit, irrespective of the stack size. The only place this does not hold is that there seems to be "something" when the error comes at around 668161892 bytes, irrespective of the data size or of (data size - stack size). Please see the last line of the table, where the data size is around 1536 MB and the stack size is around 64 MB as you suggested, but the error still comes at 637 MB.

Please can you throw some more light.

Regs,
Ninad
A. Clay Stephenson
Acclaimed Contributor

Re: Out of memory error

First run chatr on your Perl executable and post the results. That will tell me what kind of executable you have.

In the default SHARED_MAGIC 32-bit world, it's not quite true that the stack always subtracts from dynamically allocated data. The limit is 1 GB (actually ~960 MB because of some other address mapping) for the total, but since the stack and heap are allocated from each end of the 1 GB quadrant, they don't necessarily collide.
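Pictorially, something like this (a simplified sketch of that one quadrant, not to scale):

# |<----------------------- ~1 GB quadrant ----------------------->|
# [ static data | heap, allocated from this end -> ... <- stack, from the other end ]
# The two only collide when their combined use approaches the quadrant size.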

Ninad_1
Honored Contributor

Re: Out of memory error

Clay,

Please find the output of chatr for perl

chatr /usr/local/bin/perl5.6.1
/usr/local/bin/perl5.6.1:
shared executable
shared library dynamic path search:
SHLIB_PATH enabled first
embedded path disabled second Not Defined
shared library list:
dynamic /usr/lib/libcl.2
dynamic /usr/lib/libpthread.1
dynamic /usr/lib/libnsl.1
dynamic /usr/lib/libnm.sl
dynamic /usr/lib/libdld.2
dynamic /usr/lib/libm.2
dynamic /usr/lib/libc.2
dynamic /usr/lib/libsec.2
dynamic /usr/local/lib/libiconv.sl
shared library binding:
deferred
global hash table disabled
plabel caching disabled
global hash array size:1103
global hash array nbuckets:3
shared vtable support disabled
static branch prediction disabled
executable from stack: D (default)
kernel assisted branch prediction enabled
lazy swap allocation disabled
text segment locking disabled
data segment locking disabled
third quadrant private data space disabled
fourth quadrant private data space disabled
third quadrant global data space disabled
data page size: D (default)
instruction page size: D (default)
nulptr references disabled
shared library private mapping disabled
shared library text merging disabled


But can you still explain why up to 512 MB the error is thrown at approximately the same size as allocated to data, whereas somewhere above 512 MB the error is thrown at around 637 MB whatever the data value? I am desperate to understand this.

Thanks for your guidance,
Ninad
A. Clay Stephenson
Acclaimed Contributor
Solution

Re: Out of memory error

Chatr indicates that yours is a 32-bit shared executable w/o additional quadrants used for the data segment. This means that ALL static data, stack, and heap (dynamically allocated memory) are confined to the 2nd quadrant. Thus the total of these cannot exceed 1 GB, regardless of any setting of maxdsiz above 1 GB. I don't know how you are actually measuring the amount of memory allocated in Perl, but I am very suspicious. Also, I haven't bothered to look through the Perl source, so I don't know whether your ulimit (soft) values are being overridden internally by Perl. In short, I don't trust your data.

Instead let's use a known quantity:
-------------------------------------
#include <stdio.h>
#include <stdlib.h>

#define CHUNK (1024 * 1024)

#define assign_errno(x) ((errno != 0) ? errno : (x))

extern int errno;

int main()
{
int cc = 0;
char *p = NULL;
long tot = 0L;

do
{
p = (char *) malloc((size_t) CHUNK);
if (p != NULL) tot += CHUNK;
else cc = assign_errno(-1);
}
while (p != NULL);
(void) printf("Tot: %10ld Status: %d\n",tot,cc);
return(cc);
}
--------------------------------------
Compile it like this:
cc memtest.c +DD32 -o memtest
and then run chatr memtest to make sure it matches your Perl. There will probably be a difference in SHLIB_PATH and certainly in the shared libraries, but these differences are not significant. The code should compile even with the Bundled C compiler.

This guy allocates 1MB of memory at a time until a failure occurs. Your settings of maxdsiz and maxssiz should make more sense with this program.
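For example, to watch it hit a given data limit (values in kbytes):

ulimit -d 524288   # cap the data segment at 512 MB for this shell
./memtest          # Tot should come in just under the cap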

In any event, it is very dumb to have your stack as large as you have it. Only extremely poorly written code ever needs a stack bigger than 128 MB (and this applies to 64-bit code as well), and the vast majority of code will run happily at 32 MB. If I ever need more than 64 MB for a stack, I have a very serious talk with the developer.

Ninad_1
Honored Contributor

Re: Out of memory error

Clay,

You have given me a fantastic program.
I ran it for several settings of data and stack using ulimit again. Below is the conclusion, for others who wish to understand how maxssiz and maxdsiz work in practice.
BUT before that, one last question:
I used chatr +q3p enable memtest to allow memtest to use the 3rd quadrant, and it worked fine, i.e. I was able to use more than 1 GB of memory for the data segment.
Similarly I used chatr on perl to allow it to use the 3rd quadrant. But is this a proper and safe method? Will it cause any problems on my server? And will this setting be permanent, i.e. will it survive reboots?
Clay, can you answer this last one for me please?


Here's the conclusion on maxdsiz and maxssiz.
As Clay has been rightly saying, 32-bit executables use the 1st quadrant; the only thing is, he said the space in that quadrant will be less than 1 GB, and I found it to be 1023 MB (i.e. 1 MB less than 1 GB).
When we define maxdsiz and maxssiz, the maximum stack size equals maxssiz and the maximum data segment size equals (1 GB - 1 MB - maxssiz) or (maxdsiz - 1 MB), whichever is less.
To explain with an example (these values may not be sensible in practice; in my case maxssiz is 383 MB, which is far higher than practically required):
1. If maxssiz is set to 383 MB and maxdsiz to 800 MB, we get max stack size = 383 MB BUT max data size = 1 GB - 1 MB - maxssiz = 640 MB (even though maxdsiz is set to 800 MB).
2. If maxssiz is set to 64 MB and maxdsiz to 800 MB, we get max stack size = 64 MB and max data size = maxdsiz - 1 MB = 799 MB.

We can use ulimit to set the data or stack size for the environment you are working in: the data size can be set up to the value of maxdsiz and the stack size up to the value of maxssiz. But in practice the maximum stack available equals the value set by ulimit (as is logical), whereas the maximum data available equals (1 GB - 1 MB - maxssiz), not (1 GB - 1 MB - the stack size set by ulimit).
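As an arithmetic check of that rule against my runs (sizes in MB):

echo $((1024 - 1 - 383))   # usable data with maxssiz = 383 MB -> 640 MB, in line with my ~637 MB failures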

Thanks to everyone for the help (especially Clay, you are great).

Ninad
A. Clay Stephenson
Acclaimed Contributor

Re: Out of memory error

Yes, your chatr command actually modifies the executable, so the changes are preserved across reboots. Also, note that I said that DATA for shared executables is confined to the 2ND quadrant, not the 1ST. The 1st quadrant is used for the program's text (instructions). The chatr +q3p enable option extends the data into Q3 as well, but at a cost: it reduces the space normally available for shared library code, shared memory-mapped files and possibly shared memory. Your perl executable should be fine with this chatr option.
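For the record, the change and the check look roughly like this:

chatr +q3p enable /usr/local/bin/perl5.6.1       # modifies the executable file in place
chatr /usr/local/bin/perl5.6.1 | grep quadrant   # "third quadrant private data space" should now read enabled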
Bill Hassell
Honored Contributor

Re: Out of memory error

To add a bit to Clay's program, I've attached a similar program with instructions on how to make it grab 900, 1700, 2700 and 3700 megs when compiled in 32-bit mode, and essentially unlimited RAM in 64-bit mode (I quit testing when the program grabbed 47 GB on a 2 GB D-class box). I will also strongly agree about the stack size: there is no good reason to have a program stack that exceeds 20-40 megs. The vast majority of programs use less than 10 megs.


Bill Hassell, sysadmin