In article <11**********************@i39g2000cwa.googlegroups.com>,
gamehack <ga******@gmail.com> wrote:
> I was debugging some code today and a few sign problems popped up. So I
> wondered how assignments are defined between variables with different
> signedness. For example:
>
>     unsigned char a = 10;
>     signed char b;
>     b = a;
>
> Or does the actual signedness matter? The only significance I can think
> of is how the numbers are interpreted, so the actual assignment just
> copies the bits. Am I right?
No, the signedness does matter. The result is well defined when converting
*to* an unsigned type, but implementation-defined when converting
from unsigned to signed, unless the value fits anyway.
In practice, on most machines you will encounter, the bits
will just be copied, but that is up to the implementation. A student-
oriented compiler might deliberately fault, for example.
In C there are three possible representations of signed integral values --
two's complement, ones' complement, and sign and magnitude -- and the
"copy the bits" heuristic is only valid for one of the three.
It is possible that you may never encounter an actual system with
one of the two other valid representations, but it could happen.
--
Okay, buzzwords only. Two syllables, tops. -- Laurie Anderson