ju**********@yahoo.co.in wrote:
Hi,
Consider the following piece of code:
int i = 0x12345678;
char c;
c = i;
printf("0x%x\n", c);
What value will be printed?
As per K&R, longer integers are converted to shorter ones by dropping
the higher-order bits, so the value printed should be "0x78".
In standard C, that is guaranteed only for unsigned integer types. On
many implementations it also holds for signed integer types, but don't
rely on it if you can avoid it.
However, my question is: can this value differ on machines with
different endianness?
It is not likely to depend on endianness. If both types are declared
unsigned, and if the types have the sizes you expect (you're assuming
an int is large enough to store 0x12345678, and that char has 8 bits),
then it will print 0x78 no matter how the value is laid out in memory.
With signed types, however, "either the result is implementation-defined
or an implementation-defined signal is raised" (C99 6.3.1.3).
The output could be 0x78, but it is just as legal for 0xFF or 0 to be
assigned to c, or for the program to abort.