Out of memory error
12-14-2005 09:02 PM
While running a Perl script we are getting an "Out of memory!" error.
The script reads a file of around 250 MB and populates a hash table from it. When I monitor the process through glance, I can see that it runs until it is using around 623 MB and then exits with the error above.
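In essence the script does something like this (a minimal sketch; the file name and record layout here are made up):
-------------------------------------
#!/usr/bin/perl
# Minimal sketch of the kind of script described above
# (hypothetical file name and record layout): read a large
# file line by line and load every record into a hash.
my %table;
open(FH, '<', '/data/bigfile.dat') or die "open failed: $!";
while (my $line = <FH>) {
    chomp $line;
    my ($key, $value) = split(/\|/, $line, 2);
    # each hash entry costs far more memory than the raw bytes,
    # which is how a 250 MB file can need 600+ MB of heap
    $table{$key} = $value;
}
close(FH);
print scalar(keys %table), " records loaded\n";
-------------------------------------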
I have read through previous posts about similar errors; the suggestion there is to increase maxdsiz and/or maxdsiz_64bit.
Here are the maxdsiz, maxssiz and maxtsiz parameters on my server:
maxdsiz 393216 pages
maxdsiz_64bit 327680 pages
maxssiz 98048 pages
maxssiz_64bit 98048 pages
maxtsiz 16384 pages
maxtsiz_64bit 262144 pages
(pagesize=4096 bytes)
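Converting with the 4096-byte page size: maxdsiz = 393216 pages x 4096 = 1610612736 bytes (1536 MB), and maxssiz = 98048 pages x 4096 = 401604608 bytes (383 MB).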
ulimit -a gives
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1572864
stack(kbytes) 392192
memory(kbytes) unlimited
coredump(blocks) 4194303
As per those parameters, maxdsiz = 1536 MB and maxssiz = 383 MB. (Hence, as some posts explain, since 32-bit programs operate in the first quadrant or something like that, the effective memory left for the data area is 1536 - 383 = 1153 MB.)
Now my question is: why does the program exit with an out-of-memory error at around 623 MB when ideally it could use up to 1153 MB?
Please also suggest what values these parameters should have.
I am attaching the output of sysdef for the other kernel parameters.
Your help is much appreciated.
Thanks,
Ninad
12-14-2005 09:22 PM
Re: Out of memory error
Perhaps it would be possible to run this program under the tusc utility. Then you can get a look at what the system says is happening right when it terminates.
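For example, something like tusc -o /tmp/perl.trace perl yourscript.pl (the script name is yours; -o writes the trace to a file, and the tusc documentation lists the exact options for your version).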
If there is a kernel-related issue, it probably has to do with memory regions, but I can't work the math out to figure out why it fails where it does.
SEP
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
12-14-2005 09:27 PM
Re: Out of memory error
First of all, in HP-UX 11.0 those parameters are in bytes, not pages. I don't know which OS version you are running, but look at the description of each parameter in SAM for a better understanding.
Here are my values as an example:
maxdsiz            536870912    536870912   Static  N/A  M
maxdsiz_64bit     1073741824   1073741824   Static  N/A  M
maxfiles                1024         1024   Static  N/A  S
maxfiles_lim            1024         1024   Static  N/A  H
maxssiz            134217728    134217728   Static  N/A  M
maxssiz_64bit     1073741824   1073741824   Static  N/A  M
maxswapchunks          16384        16384   Static  N/A  M
maxtsiz            134217728    134217728   Static  N/A  M
maxtsiz_64bit
Best Regards,
Eric Antunes
12-14-2005 10:02 PM
Re: Out of memory error
I do not have the tusc utility installed on my server. Can you suggest any other method? How can I sort this problem out? Please help.
Also, can you let me know whether my understanding of the parameters is correct, especially the first-quadrant and maxdsiz - maxssiz part?
Eric,
You are absolutely right. SAM shows maxdsiz and the related parameters in bytes, but the figures I provided in my post are from the output of the sysdef command, which reports them in pages.
Thanks,
Ninad
12-14-2005 10:06 PM
Re: Out of memory error
You can get tusc from the following location:
http://hpux.connect.org.uk/hppd/hpux/Sysadmin/tusc-7.8/
-Arun
12-14-2005 10:16 PM
Re: Out of memory error
Thanks for the link. Coincidentally, I was just looking at that page after finding it through Google. But a binary depot is only available for 11i onwards, whereas my server is running 11.00. Can you suggest any other alternative? Do any patches need to be installed?
Thanks,
Ninad
12-14-2005 10:21 PM
Re: Out of memory error
I'd suggest downloading and compiling the Perl source code from http://www.perl.org or http://www.cpan.org.
Also, if you contact your HP rep or open a support call, I'd say HP has those Perl binaries for HP-UX 11.00 lying around.
I'm nearly certain we have them, but it's probably a violation of our support agreement to distribute those files.
The tarball here might work:
http://hpux.connect.org.uk/hppd/hpux/Languages/perl-5.8.7/
SEP
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
12-14-2005 10:22 PM
Re: Out of memory error
You can install the 11.11 tusc depot on 11.0; it will work well. Also, have a look at this page to get to know tusc:
http://h21007.www2.hp.com/dspp/tech/tech_TechDocumentDetailPage_IDX/1,1701,2894,00.html
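To install the depot, something along these lines should work (the depot path here is just an illustration): swinstall -s /tmp/tusc.depot \*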
-Arun
12-15-2005 01:06 AM
Re: Out of memory error
I managed to get an 11.00 depot of Perl 5.6.1, which I installed, and then ran the script again. I am still getting an error, only now it is more elaborate; I hope it makes sense to you.
Please can you advise on how I should proceed?
The error is: "Out of memory during request for 212 bytes, total sbrk() is 668161892 bytes!"
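(For reference, 668161892 bytes is about 637 MB, so the heap is hitting a ceiling well below the ~1153 MB I calculated earlier.)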
Regs,
Ninad
12-15-2005 02:08 AM
Re: Out of memory error
12-15-2005 03:11 AM
Re: Out of memory error
I did a small experiment to understand the parameters and the problem.
The kernel parameters are:
maxdsiz 1610612736 bytes / 1572864 kbytes
maxssiz 401604608 bytes / 392192 kbytes
ulimit -a gives
data 1572864 kbytes
stack 392192 kbytes
Then I ran the Perl script with various combinations of data and stack sizes, using ulimit -d and ulimit -s to set them.
The results are as follows:
data (bytes)   stack (bytes)   error at (bytes)
------------   -------------   ----------------
  1610612736       401604608          668161892
  1073741824       401604608          668161892
   524288000       401604608          512081764
   524288000       104857600          512081764
   209715200       104857600          208076644
   104857600       104857600          104165220
    52428800       104857600           52201316
  1610612736        67108864          668161892
From this I am unable to reconcile what I have read in the earlier posts, and what you have stated, about the use of the first quadrant and the actual usable space for data being maxdsiz - maxssiz.
If you look at the figures, the error comes at approximately the data segment size set by ulimit, irrespective of the stack size. The one thing that does not hold is a ceiling of around 668161892 bytes, reached irrespective of the data size or of (data size - stack size). See the last row of the table: the data size is around 1536 MB and the stack size around 64 MB, as you suggested, yet the error still comes at about 637 MB.
Please can you throw some more light on this?
Regs,
Ninad
12-15-2005 03:41 AM
Re: Out of memory error
In the default SHARED_MAGIC 32-bit world, it's not quite true that the stack always subtracts from dynamically allocated data. The limit for the total is 1 GB (actually ~960 MB because of some other address mappings), but since the stack and heap are allocated from opposite ends of the 1 GB quadrant, they don't necessarily collide.
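Roughly, the picture is this (a simplified sketch, not an exact address map):
low addresses                                   high addresses
+-------------------------------------------------------------+
| heap (grows up) -->         free         <-- stack (maxssiz) |
+-------------------------------------------------------------+
|<-------------- ~1 GB private data quadrant ----------------->|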
12-15-2005 03:53 AM
Re: Out of memory error
Please find the output of chatr for perl:
chatr /usr/local/bin/perl5.6.1
/usr/local/bin/perl5.6.1:
shared executable
shared library dynamic path search:
SHLIB_PATH enabled first
embedded path disabled second Not Defined
shared library list:
dynamic /usr/lib/libcl.2
dynamic /usr/lib/libpthread.1
dynamic /usr/lib/libnsl.1
dynamic /usr/lib/libnm.sl
dynamic /usr/lib/libdld.2
dynamic /usr/lib/libm.2
dynamic /usr/lib/libc.2
dynamic /usr/lib/libsec.2
dynamic /usr/local/lib/libiconv.sl
shared library binding:
deferred
global hash table disabled
plabel caching disabled
global hash array size:1103
global hash array nbuckets:3
shared vtable support disabled
static branch prediction disabled
executable from stack: D (default)
kernel assisted branch prediction enabled
lazy swap allocation disabled
text segment locking disabled
data segment locking disabled
third quadrant private data space disabled
fourth quadrant private data space disabled
third quadrant global data space disabled
data page size: D (default)
instruction page size: D (default)
nulptr references disabled
shared library private mapping disabled
shared library text merging disabled
But can you still explain why, up to 512 MB, the error is thrown at approximately the size allocated to data, whereas somewhere above 512 MB the error is thrown at around 637 MB whatever the data size? I am desperate to understand this.
Thanks for your guidance,
Ninad
12-15-2005 06:21 AM
Solution
Instead, let's use a known quantity:
-------------------------------------
#include <stdio.h>   /* printf */
#include <stdlib.h>  /* malloc */

#define CHUNK (1024 * 1024)
#define assign_errno(x) ((errno != 0) ? errno : (x))

extern int errno;

int main()
{
    int cc = 0;
    char *p = NULL;
    long tot = 0L;

    /* grab 1 MB at a time until malloc() fails */
    do
    {
        p = (char *) malloc((size_t) CHUNK);
        if (p != NULL) tot += CHUNK;
        else cc = assign_errno(-1);
    }
    while (p != NULL);
    (void) printf("Tot: %10ld Status: %d\n", tot, cc);
    return(cc);
}
--------------------------------------
Compile it like this:
cc memtest.c +DD32 -o memtest
and then run chatr memtest to make sure it matches your Perl. There will probably be a difference in SHLIB_PATH and certainly in the shared libraries, but those differences are not significant. The code should compile even with the bundled C compiler.
This guy allocates 1 MB of memory at a time until a failure occurs. Your settings of maxdsiz and maxssiz should make more sense with this program.
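For example, you can repeat your earlier ulimit experiment with it (ulimit values are in kbytes): run ulimit -d 524288 to cap data at 512 MB, then ./memtest, and compare the printed Tot against the limit.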
In any event, it is very dumb to have your stack as large as you have it. Only extremely poorly written code ever needs a stack bigger than 128 MB (and this applies to 64-bit code as well); the vast majority of code will run happily at 32 MB. If I ever need more than 64 MB for a stack, I have a very serious talk with the developer.
12-15-2005 09:34 AM
Re: Out of memory error
You have given me a fantastic program.
I ran it for several settings of data and stack, using ulimit again. Below is the conclusion, for others who wish to understand how maxssiz and maxdsiz work in practice.
BUT before that, one last question.
I used chatr +q3p memtest to allow memtest to use the third quadrant, and it worked fine, i.e. I was able to use more than 1 GB of memory for the data segment.
Similarly, I used chatr on perl to allow it to use the third quadrant. But is this a proper and safe method? Will it cause any problems on my server? And is the setting permanent, i.e. will it survive reboots?
Clay, can you answer this last one for me please?
Here's the conclusion on maxdsiz and maxssiz.
As Clay has rightly been saying, 32-bit executables use the first quadrant; the only thing is, he said the space in that quadrant would be somewhat less than 1 GB, and I found it to be 1023 MB (i.e. 1 MB less than 1 GB).
When we define maxdsiz and maxssiz, the maximum stack size equals maxssiz, and the maximum data segment size equals (1 GB - 1 MB - maxssiz) or (maxdsiz - 1 MB), whichever is smaller.
To explain with an example (these values may not be sensible in practice; in my case maxssiz is 383 MB, which is far higher than practically required):
1. If maxssiz is set to 383 MB and maxdsiz to 800 MB, we get a max stack size of 383 MB BUT a max data size of 1 GB - 1 MB - maxssiz = 640 MB (even though maxdsiz is set to 800 MB).
2. If maxssiz is set to 64 MB and maxdsiz to 800 MB, we get a max stack size of 64 MB and a max data size of maxdsiz - 1 MB = 799 MB.
We can use ulimit to set the data or stack size for the environment we are working in: the data size can be set up to the value of maxdsiz and the stack size up to the value of maxssiz. In practice the maximum stack available equals the value set by ulimit (as is logical), but the maximum data available equals (1 GB - 1 MB - maxssiz), not (1 GB - 1 MB - the stack size set by ulimit).
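In formula form, my reading of the results above (a summary of the experiment, not official documentation):
max stack = min(ulimit -s, maxssiz)
max data  = min(ulimit -d, maxdsiz - 1 MB, 1 GB - 1 MB - maxssiz)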
Thanks everyone for your help (especially Clay - you are great).
Ninad
12-15-2005 09:47 AM
Re: Out of memory error
12-15-2005 11:47 AM
Re: Out of memory error
Bill Hassell, sysadmin