
Compile time initialization of data.


typedef unsigned long long int uint64;
typedef unsigned char uint8;

Class Simple
{
union { uint64 x; uint8 r[8]; };

public:
Simple(uint64 n) : x(n) {;}
//....
};
Class Simple_user
{
static const Simple simple_array[8];

public:
//....
};

const Simple Simple_user::simple_array[8] =
{ 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };
Will the array of simples always be constructed at compile time?
Can an optimizing compiler construct the array at compile time?

Thanks for any answers/comments.

--
Regards,
S.K.Mody
Aug 3 '05 #1
7 Replies


S.K.Mody wrote:
> typedef unsigned long long int uint64;

Not valid C++ (yet). There is no 'long long' type in C++. You
should probably have used 'double'.

> typedef unsigned char uint8;
>
> Class Simple

class Simple

, maybe?

> {
> union { uint64 x; uint8 r[8]; };
>
> public:
> Simple(uint64 n) : x(n) {;}
> //....
> };
> Class Simple_user

class Simple_user

, maybe?

> {
> static const Simple simple_array[8];
>
> public:
> //....
> };
>
> const Simple Simple_user::simple_array[8] =
> { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };
>
> Will the array of simples always be constructed at compile time?

No. It's unspecified (8.5.1/14).

> Can an optimizing compiler construct the array at compile time?

It may. Whether a particular one _can_ or not depends on its
implementors, doesn't it?
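
To see concretely what 8.5.1/14 leaves open, compare a POD aggregate with
a class that has a user-declared constructor. A rough sketch, assuming
C++03 rules (the types and names are illustrative, not taken from the
code above):

typedef unsigned long word_t;      // plain unsigned long, to stay within standard C++

struct PodSimple                   // POD aggregate: no user-declared constructor
{
    word_t x;
};

// All initializers are constant expressions and the type is POD, so this
// array must be initialized during the static phase (8.5.1/14, 3.6.2).
const PodSimple pod_array[4] = { {0x01}, {0x02}, {0x03}, {0x04} };

class CtorSimple                   // non-POD: user-declared constructor
{
    word_t x;
public:
    CtorSimple(word_t n) : x(n) {}
};

// Each element goes through the constructor; whether that happens at
// compile time or via a dynamic initializer run before main() is left
// to the implementation.
const CtorSimple ctor_array[4] = { 0x01, 0x02, 0x03, 0x04 };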

V
Aug 3 '05 #2

Victor Bazarov wrote:
> S.K.Mody wrote:
>> typedef unsigned long long int uint64;
>
> Not valid C++ (yet). There is no 'long long' type in C++. You
> should probably have used 'double'.

In the actual code, I used the C99 header provided with
glibc (stdint.h), which has nice fixed-width types like
uint16_t, uint64_t, int_fast8_t, etc., which may make it into C++
eventually. I'll have to use some preprocessor conditionals or
roll my own uint64 if I decide to compile it for other platforms.

>> Will the array of simples always be constructed at compile time?
>
> No. It's unspecified (8.5.1/14).
>
>> Can an optimizing compiler construct the array at compile time?
>
> It may. Whether a particular one _can_ or not depends on its
> implementors, doesn't it?

True, but I am hoping that there is some agreement among compiler
writers as to what constitutes an acceptable level of optimization.
After all, there is technically no requirement, for example, that
a compiler inline any methods at all, but for some types of code
that would prove to be unacceptable.

I'm using g++ on x86 Linux.
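
If it turns out that the constructors are not folded away, one fallback
would be to keep the statically initialized table as plain POD data and
build the Simple objects on demand. Just a sketch along the lines of the
code above (the accessor name is made up):

typedef unsigned long long int uint64;   // relies on the compiler extension
typedef unsigned char uint8;

class Simple
{
    union { uint64 x; uint8 r[8]; };
public:
    Simple(uint64 n) : x(n) {}
    //....
};

class Simple_user
{
    // POD array with constant initializers: static initialization is guaranteed.
    static const uint64 raw_values[8];
public:
    static Simple simple(int i) { return Simple(raw_values[i]); }
    //....
};

const uint64 Simple_user::raw_values[8] =
{ 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };

The cost is a Simple construction at each access instead of once before
main(), which may or may not matter.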


--
Regards,
S.K.Mody
Aug 4 '05 #3

S.K.Mody wrote:
> [..] I am hoping that there is some agreement among compiler
> writers as to what constitutes an acceptable level of optimization.
> After all, there is technically no requirement, for example, that
> a compiler inline any methods at all, but for some types of code
> that would prove to be unacceptable.

There is no other "agreement" among compiler writers except the
Standard Document, I hope.

> I'm using g++ on x86 Linux.

Good for you. It doesn't matter here, though.

V
Aug 4 '05 #4

Victor Bazarov wrote:
> S.K.Mody wrote:
>> [..] I am hoping that there is some agreement among compiler
>> writers as to what constitutes an acceptable level of optimization.
>> After all, there is technically no requirement, for example, that
>> a compiler inline any methods at all, but for some types of code
>> that would prove to be unacceptable.
>
> There is no other "agreement" among compiler writers except the
> Standard Document, I hope.

Why do you hope? Would there be any problems if a subset
of the set of all compilers behaved somewhat predictably in
some respects even though the exact behaviour is left
unspecified by the standard?


--
Regards,
S.K.Mody
Aug 4 '05 #5

S.K.Mody wrote:
> Victor Bazarov wrote:
>> S.K.Mody wrote:
>>> [..] I am hoping that there is some agreement among compiler
>>> writers as to what constitutes an acceptable level of optimization.
>>> After all, there is technically no requirement, for example, that
>>> a compiler inline any methods at all, but for some types of code
>>> that would prove to be unacceptable.
>>
>> There is no other "agreement" among compiler writers except the
>> Standard Document, I hope.
>
> Why do you hope? Would there be any problems if a subset
> of the set of all compilers behaved somewhat predictably in
> some respects even though the exact behaviour is left
> unspecified by the standard?

Yes. The problem is simple: if there is nothing _governing_
the behaviour, there is nothing to prevent it from _changing_ some
sunny day, and therefore none of it can be _relied upon_. What
else did you expect me to tell you?
Aug 4 '05 #6

Victor Bazarov wrote:
> S.K.Mody wrote:
>> Victor Bazarov wrote:
>>> S.K.Mody wrote:
>>>> [..] I am hoping that there is some agreement among compiler
>>>> writers as to what constitutes an acceptable level of optimization.
>>>> After all, there is technically no requirement, for example, that
>>>> a compiler inline any methods at all, but for some types of code
>>>> that would prove to be unacceptable.
>>>
>>> There is no other "agreement" among compiler writers except the
>>> Standard Document, I hope.
>>
>> Why do you hope? Would there be any problems if a subset
>> of the set of all compilers behaved somewhat predictably in
>> some respects even though the exact behaviour is left
>> unspecified by the standard?
>
> Yes. The problem is simple: if there is nothing _governing_
> the behaviour, there is nothing to prevent it from _changing_ some
> sunny day, and therefore none of it can be _relied upon_. What
> else did you expect me to tell you?

I think the philosophy of C++ provides the governing principle:
to achieve the right balance between portability and efficiency.
For example, without inlining, to which the original question is
closely related, it would often be unacceptably inefficient to
have deeply nested calls to small functions. But such calls may
be necessary for a variety of reasons related to good C++ design.
So should one go back to writing macros and forget about design
principles, or can one compromise a little and ask for some informal
guarantees from the specific compiler (or class of compilers)
that one may be working with? The latter seems to me to be the better
option, since the choice is between unmaintainable spaghetti code and
well-designed code with some compiler-specific preprocessing.
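
To make the trade-off concrete, the sort of thing in question is roughly
this (illustrative names only): a small accessor versus the macro one
would otherwise fall back on.

// Macro version: always "inlined", but with no scoping or type checking.
#define BYTE_OF(value, i)  ((unsigned char)(((value) >> (8 * (i))) & 0xFF))

class Word
{
    unsigned long value_;
public:
    explicit Word(unsigned long v) : value_(v) {}

    // Small member function: type-safe and well scoped, but only as cheap
    // as the macro if the compiler chooses to inline it.
    unsigned char byte_at(int i) const
    {
        return (unsigned char)((value_ >> (8 * i)) & 0xFF);
    }
};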

You may regard this as a strictly compiler-related question, but
it seems to me that the C++ efficiency goals virtually require
the compiler to provide such informal, albeit non-portable,
guarantees. The original question could therefore be rephrased
as "Is there any sort of uniformity among compilers in this regard?"
I'm not sure whether your answers were based on specific knowledge
of widely varying implementations or on the legal position of the
standard.

--
Regards,
S.K.Mody
Aug 4 '05 #7

S.K.Mody wrote:
> I think the philosophy of C++ provides the governing principle:
> to achieve the right balance between portability and efficiency.
> For example, without inlining, to which the original question is
> closely related, it would often be unacceptably inefficient to
> have deeply nested calls to small functions. But such calls may
> be necessary for a variety of reasons related to good C++ design.

Uh-huh.

> So should one go back to writing macros and forget about design
> principles, or can one compromise a little and ask for some informal
> guarantees from the specific compiler (or class of compilers)
> that one may be working with? The latter seems to me to be the better
> option, since the choice is between unmaintainable spaghetti code and
> well-designed code with some compiler-specific preprocessing.

If you know that you will never compile your program on anything other
than the compiler you're using today, do what you like. But rest assured
that there are lots of people who expected their code not to still be
around 20 years later. (Such as that in banking systems... remember Y2K?
It's not like the programmers didn't know about the problem; they just
expected the code not to be around when the problem surfaced.)

The point is that one day somebody might try to re-use the code (I have
heard of it happening in the past). They may not expect it to rely upon
non-standard behaviour.

> You may regard this as a strictly compiler-related question, but
> it seems to me that the C++ efficiency goals virtually require
> the compiler to provide such informal, albeit non-portable,
> guarantees.

Not in the slightest. Optimisation is nice, but there is loads of code
that relies upon side effects that should not exist, and optimising that
code breaks it. Again, reliance upon non-standard behaviour (such as
doing anything in a copy constructor that is not purely related to
copying the object) might break your code some day.
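
For instance (a sketch with made-up names, not from any real system), the
standard permits the copies below to be elided even though the copy
constructor has an observable side effect:

#include <iostream>

class Tracked
{
public:
    Tracked() {}
    Tracked(const Tracked&) { std::cout << "copied\n"; }   // side effect
};

Tracked make()
{
    return Tracked();          // this copy may be elided (RVO)
}

int main()
{
    Tracked t = make();        // "copied" may appear zero, one or two times
    (void)t;
    return 0;
}

Code that counts those messages, or does real work in the copy
constructor, will behave differently across compilers and optimisation
settings.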
> The original question could therefore be rephrased
> as "Is there any sort of uniformity among compilers in this regard?"

Yeah, of course there is. But that's NOT the question. The question is
"Should you rely upon it?"

> I'm not sure whether your answers were based on specific knowledge
> of widely varying implementations or on the legal position of the
> standard.

Probably a bit (or more) of both.

Ben
--
I'm not just a number. To many, I'm known as a String...
Aug 8 '05 #8
