"Novitas" <ke*@clement.name> writes:
As a practical matter it is always defined, just not the same way from
implementation to implementation.
<snip>
No, the behaviour is undefined. Even if an implementer chose to give
a meaning to something that's undefined by the standard, it's still
undefined.
I'll admit that I was being excessively cute with my answer. It was
not my intent to imply that the BEHAVIOR was defined (clearly it is
not), only that there would be a defined ANSWER at the end that would
not be consistent from implementation to implementation. This stands
in opposition to, say, x / 0, which, depending on the state of
floating-point exceptions, might not produce any ANSWER at all
(division by zero being mathematically undefined), whereas "as a
practical matter" i++ + i++ is just ambiguous. Still, it's bad, bad,
bad...
Please provide proper attributions so we can tell who wrote what.
Even the groups.google.com interface will do this for you if you
follow the standard advice:
If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers.
The code in question was:
int i = 10;
int a = i++ + i++;
This is not merely ambiguous, it invokes undefined behavior. This
means that there can be no defined answer. As far as the language is
concerned, it could store any value in a (20, 21, 22, 137, 0xdeadbeef,
or a banana split), or it could fail to store any value in a, or it
could cause the program to abort, or it could cause the memory chip
holding the value of a to explode into a colorful mist of adverbs, or,
classically, it could make demons fly out your nose. It is undefined
behavior in exactly the same sense that x/0 is undefined behavior.
Now it's likely that on most real-world implementations, the result
will be that a takes on the value 20 or 21; after all, those are just
two of the infinitely many possible consequences of undefined
behavior.
It's also possible that a sufficiently clever compiler will recognize
the undefined behavior (though it's not required to) and refuse to
compile the program.
That's what the standard says, but it doesn't quite explain why.
There are two answers to that. (Well, three if you count "because the
standard says so", but that's not very satisfying.)
One is that there are probably some things defined as undefined
behavior that don't really need to be. The standard *could* place
tighter constraints on some things. The problem is that it's not
clear just what those things are. If the committee had taken the time
to go through every instance of undefined behavior and argue about
whether it can be "fixed", they wouldn't have had time to produce the
standard.
Another has to do with optimization. An optimizer is typically an
optional phase of a multi-pass compiler; it takes as input some
intermediate form of the program, and produces as output the same
intermediate form, but with transformations that presumably make it
more efficient. An optimizer is constrained by correctness
requirements; it can't take a valid program that does one thing and
turn it into a program that does something else. But in proving that
a transformation can't break the program, the optimizer is allowed to
assume that the program doesn't invoke undefined behavior (because if
it does, the program is already broken anyway, and breaking it further
is ok). If that assumption is incorrect, anything can happen.
The standard deliberately allows optimizers to mangle bad code (code
that invokes undefined behavior) so they can do as good a job as
possible on good code (code that doesn't invoke undefined behavior).
--
Keith Thompson (The_Other_Keith)
ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.