Bytes | Software Development & Data Engineering Community
to calculate bitsize of a byte

I am reading "Joel on Software" these days, and am stuck on the
question of "how to calculate the bitsize of a byte", which is listed as
one of the basic interview questions in Joel's book. Could anyone give
some ideas? I look forward to your replies.
David.

Feb 21 '06
"Richard G. Riley" <rg***********@gmail.com> writes:
On 2006-02-21, Jordan Abel <ra*******@gmail.com> wrote:
On 2006-02-21, Richard G. Riley <rg***********@gmail.com> wrote:
On 2006-02-21, Al Balmer <al******@att.net> wrote: [...] Consider the possibility that you might want to calculate the size of
a character at run-time.

That is a lot easier than a BYTE, isn't it?

Just left-shift a bit x times on a defined "char" and detect when it goes to zero.


Except for overflow. And if this _does_ work, it will result in the same
answer as CHAR_BIT.


I would hope so: I was referring to the fact that a lot of
people wanted to muddy the waters with run-time calculations of byte
size as opposed to compile-time constants. "char" is easier because it
is a defined type, so doing the bit shift really can and does work
without any worries about "purity" ...


If char is signed, a left shift that overflows invokes undefined
behavior. You can reliably shift an *unsigned* char to determine the
number of bits.

#include <stdio.h>
#include <limits.h>

int main(void)
{
    int bits = 0;
    unsigned char x = 1;
    while (x != 0) {
        x <<= 1;
        bits++;
    }
    printf("bits = %d\n", bits);
    if (bits != CHAR_BIT) {
        printf("CHAR_BIT = %d (OOPS!)\n", CHAR_BIT);
    }
    return 0;
}

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Feb 21 '06 #21
Rod Pemberton failed his job interview when he wrote:
"Martin Ambuhl" <ma*****@earthlink.net> wrote in message
news:AR*****************@newsread2.news.atl.earthlink.net...
Rod Pemberton wrote:
You are the only one to get it correct so far. Martin and
Vladimir both failed. The question was how to _calculate_ the
bits in a byte. Looking up CHAR_BIT is not a calculation.
Sure it is. Grow up.
The OP also said it was an _interview_ question. If it came from Microsoft,
you'd be expected to _calculate_ it, not look it up. If you fail a simple
example like this on a job interview, your chances of getting the job
decrease. That's reality, whether you think it's immature or not.


If you create a program to satisfy your childish idea of what
"calculate" means, the interviewer will know that you neither know C
nor have a clue about how to program. Successful programming
involves not doing extra work or spinning your wheels just to avoid
using what is obviously available.
Feb 21 '06 #22
david ullua wrote:
I am reading "Joel on Software" these days, and am stuck on the
question of "how to calculate the bitsize of a byte", which is listed as
one of the basic interview questions in Joel's book. Could anyone give
some ideas? I look forward to your replies.


Quite simple.
1 - Include <limits.h>
2 - Print out the value of CHAR_BIT.
Done.

--
C is a sharp tool
Feb 21 '06 #23
david ullua wrote:
Thanks to both of you for your replies.
I took a look at limits.h and the other header files; there is really
something there that needs to be read.


/Your/ C book needs to be read from cover to cover, every 6 months.

--
C is a sharp tool
Feb 21 '06 #24
Keith Thompson wrote:
If char is signed, a left shift that overflows invokes undefined
behavior. You can reliably shift an *unsigned* char to determine the
number of bits.


Not if UCHAR_MAX equals INT_MAX, which it may.

--
pete
Feb 21 '06 #25
On 2006-02-21, pete <pf*****@mindspring.com> wrote:
Keith Thompson wrote:
If char is signed, a left shift that overflows invokes undefined
behavior. You can reliably shift an *unsigned* char to determine the
number of bits.


Not if UCHAR_MAX equals INT_MAX, which it may.


Eh?

No, because you can keep left-shifting the unsigned char, and the
behavior of an unsigned type on "overflow" is perfectly well-defined.

What does int have to do with anything?
Feb 21 '06 #26
"sl*******@yahoo.com" <sl*******@gmail.com> wrote in message
news:11********************@g43g2000cwa.googlegroups.com...
To complicate matters, char is not required by the C standard to be the
machine "byte". It is only required to be at least 8 bits. And a
machine "byte" is not always 8 bits. And some ancient beasts even
enabled the programmer to specify how many bits were in a byte. So on
those machines (I believe Unisys was one) you "defined" how many bits
were in a byte rather than testing for it.

So you are saying that there are two bytes: the "machine byte" and the "C
implementation byte"? I assume that because the C standard clearly states
that char is equivalent to byte.
Your code fails on a machine with 6 bit bytes and a C compiler with 12
bit chars (remember 6 bit bytes are not allowed by the standard).


How can such a system exist then?
Feb 21 '06 #27
Jordan Abel wrote:

On 2006-02-21, pete <pf*****@mindspring.com> wrote:
Keith Thompson wrote:
If char is signed, a left shift that overflows invokes undefined
behavior. You can reliably shift an *unsigned* char to determine the
number of bits.


Not if UCHAR_MAX equals INT_MAX, which it may.


Eh?

No, because you can keep left-shifting the unsigned char, and the
behavior of an unsigned type on "overflow" is perfectly well-defined.

What does int have to do with anything?


Look up "integer promotions".

If INT_MAX is greater than or equal to UCHAR_MAX,
then the shift operator's unsigned char operand
is promoted to type int.

If INT_MAX is equal to UCHAR_MAX,
then the final iteration shifts an int whose value is
(INT_MAX + 1) / 2 left by one; the mathematical result,
INT_MAX + 1, overflows the signed type, which is undefined behavior.

unsigned short has the same problem.
You can't portably increment unsigned types until they roll over
if they are of lower rank than int.

--
pete
Feb 21 '06 #28

"david ullua" wrote:
I am reading "Joel on Software" these days, and am stuck on the
question of "how to calculate the bitsize of a byte", which is listed as
one of the basic interview questions in Joel's book. Could anyone give
some ideas? I look forward to your replies.
David.


Try this:

unsigned char x = 0;
unsigned char prev;
int y = 0;

/* set bits one at a time until every bit of x is 1 */
do {
    prev = x;
    x = (unsigned char)(((unsigned int)x << 1) | 1);
} while (x != prev);

/* shift the bits back out, counting them */
while (x != 0) {
    x >>= 1;
    y++;
}

now y should give the number of bits in a char...

regards
John.
Feb 21 '06 #29
stathis gotsis wrote:
"sl*******@yahoo.com" <sl*******@gmail.com> wrote in message
news:11********************@g43g2000cwa.googlegroups.com...
To complicate matters, char is not required by the C standard to be the
machine "byte". It is only required to be at least 8 bits. And a
machine "byte" is not always 8 bits. And some ancient beasts even
enabled the programmer to specify how many bits were in a byte. So on
those machines (I believe Unisys was one) you "defined" how many bits
were in a byte rather than testing for it.

So you are saying that there are two bytes: the "machine byte" and the "C
implementation byte"? I assume that because the C standard clearly states
that char is equivalent to byte.


Yes, which is why it's best to use the term 'byte' in the C language
sense, and terms like "octet" or "machine word" for other senses.
Your code fails on a machine with 6 bit bytes and a C compiler with 12
bit chars (remember 6 bit bytes are not allowed by the standard).


How can such a system exist then?


The same way a C90 program can construct 64-bit pseudo-ints from two
unsigned longs (which are only guaranteed to be at least 32 bits each).

--
Peter

Feb 21 '06 #30

This thread has been closed and replies have been disabled. Please start a new discussion.
