I am having a problem with a program that allocates very large amounts
of memory (approaching 2 GB total) in small chunks, e.g., a few MB at
a time.
The program is dumping core because it consistently gets a SIGABRT
while it is in a call to new to allocate memory. I tracked down
some old Usenet postings explaining that new's standard behavior is
to throw an exception (std::bad_alloc) whenever it cannot allocate the
requested memory; if that exception is never caught, terminate() is
called and the program aborts with a SIGABRT. I suspect that my problem is
that I am running up against a system limit on per-process memory
allocation, but I would like to verify this if possible. Does anyone
know how to do this? Does new ever trigger a SIGABRT for any reason
other than the case where it cannot satisfy a request because there
is not enough memory available? I have checked the arguments being
passed to the problematic new call in a debugger, and they are
reasonable, i.e., pretty much the same as those in the preceding call
to new, which succeeded. Does C++ have any special limitations on memory
allocation other than those imposed by the operating system?
Has new always triggered a SIGABRT for this reason? My original
Stroustrup reference says that by default new returns a null pointer
if it cannot satisfy a request, and says nothing about SIGABRT. Is the
exception-throwing behavior now standard, and when did it become so?
Thanks very much!
--
Roger Davis
University of Hawaii/SOEST
rb*@NoSpamHere.hawaii.edu