On Fri, 28 Mar 2008 16:01:54 -0500, CBFalconer wrote:
> Ioannis Vranos wrote:
>> Under C95: Is it guaranteed that char, unsigned char, signed char have
>> no padding bits?
Just a note: padding bits are a concept the standard only introduced in
C99; C90/C95 left much more about the representation of integer types
unspecified.
> unsigned char, yes.
Where is this guarantee made? In C99, 5.2.4.2.1 makes it as clear as it
can: "The value UCHAR_MAX shall equal 2^CHAR_BIT - 1." I don't have a
copy of an older standard. Does it make the same guarantee?
> The others by implication.
How so? What's preventing a signed integer type and its corresponding
unsigned type from having a different number of padding bits?