Hello,
ag**********@gmail.com wrote:
> cases? Do let me know if this is just a compiler issue and not related
> to C++ per se.
This is just a compiler issue, i.e. a quality-of-implementation issue.
E.g. gcc-4.1.1 compiles the code to around 7200 bytes in both cases.
So the problem must have been known.
There are many ways for a compiler to implement the initialization. The
most important rule for the compiler is the as-if rule: the compiler
may do whatever it wants, as long as the code behaves as if nothing
special had been done.
The most trivial one is doing it at compile time, by putting all the
data, fully initialized, into the data segment of the executable. This
uses a lot of space in the executable, but needs no extra code and no
extra execution time.
In the case of initialization to zero, it is possible to use the BSS
segment, which means that the memory takes up no space in the
executable file, only at execution time. The memory is obtained from
the OS initialized to all-bits-zero, at least on most UNIX systems.
This won't work when initializing to something different from zero.
The last alternative, for non-zero-initialized data, is producing code
that initializes the arrays at runtime, using space allocated in the
BSS. Judging by the resulting assembler code, this is what gcc-4.1.1
does here.
There are even more opportunities if you think about combinations.
For arrays of that size, it is clear that putting the data into the
executable would increase loading time a lot more than including a
little extra code to do the initialization.
Obviously, a compiler has to decide, based on the size of the array,
the types involved, and the optimization objective (space or time),
which of these possibilities is best. Much of the work done on
gcc-4.1.1 involved improvements in optimization, and this is one field
where improvements were possible.
Bernd Strieder