On Dec 18, 5:05 pm, "Victor Bazarov" <v.Abaza...@comAcast.net> wrote:
D. Susman wrote:
I know that this issue is indeed strictly operating system dependent
but I am just curious:
I have a five-dimensional array whose total size is almost 68 MB.
This array is a member of class X. When I simply attempt to create an
instance of that class, I get a segmentation fault.
What does "simply attempt to create an instance" mean? An automatic
object? Have you tried creating it dynamically?
Or statically? Or throwing it as an exception:-)?
I am working on Solaris (which has 8 KB pages). What may the cause be?
If that's an automatic object you're trying to create, your stack is
not big enough to contain it. Try increasing the stack size (usually
it is a linker option; RTFM to learn what it is on your system).
Usually, it's a simple command which you execute before invoking
the program. Under Solaris, "ulimit -s unlimited" should work.
(With ulimit -s set to unlimited, you can get as much stack as
you could get with dynamic allocation.)
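From the shell, this looks roughly like the following sketch (the limit is raised for the current shell and inherited by any program launched from it afterwards):

```shell
# Show the current soft stack limit (reported in kilobytes by most shells)
ulimit -s

# Raise it; the new limit applies to programs started from this shell,
# e.g.:  ulimit -s unlimited && ./myprogram   (hypothetical binary name)
ulimit -s unlimited

# Typically prints "unlimited" now (may fail if the hard limit is lower)
ulimit -s
```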
Also, consider that putting a 68 MB stress on your stack may
not be a good idea.
Doing it once shouldn't cause any problems on a modern machine
(32 bits or more). I wouldn't suggest doing it in a lot of
different functions, which might end up calling each other, but
if there's only one instance for the life of the program,
declaring it in main is probably all right. (With a data
structure of this size, he's not going to be porting to a 16 bit
machine anyway.)
There seems to be an upper limit for the size of an array.
... but I am not certain you've reached it yet.
The formal upper limit is numeric_limits<size_t>::max(). Under
Solaris, either 4E09 or 1.8E19, depending on compiler options.
In practice, of course, the program will have to allocate memory
for it at some time, and that's likely to fail long before you
reach the formal limit (especially in the second case).
There are other, more severe limitations on your program, and
they can be at play here.
Yes. Note that even if the system accepts the allocation, if
you start accessing it randomly, cache misses will slow the
program down considerably. For even larger allocations, where
there isn't enough physical memory, page faults can slow the
system down to the point of becoming unusable. (When I test our
application here under Purify (which increases the memory
footprint significantly), I have to terminate every other
application on the system. Otherwise, X gets paged out, and the
disk starts thrashing every time I move the cursor.)
--
James Kanze (GABI Software) email: ja*********@gmail.com
Conseils en informatique orientée objet /
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34