Bytes IT Community

sizeof operator

JS
I read in K&R page 204 that sizeof use on a char returns 1. But when I write
the following I get 4!

int main(void){
printf("%d\n",sizeof('g'));

}

Was it not supposed to return 1?
Nov 14 '05 #1
36 Replies


"JS" <dsa.@asdf.com> writes:
I read in K&R page 204 that sizeof use on a char returns 1. But when I write
the following I get 4!

printf("%d\n",sizeof('g'));


In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).
--
Ben Pfaff
email: bl*@cs.stanford.edu
web: http://benpfaff.org
Nov 14 '05 #2

JS

"Ben Pfaff" <bl*@cs.stanford.edu> skrev i en meddelelse
news:87************@benpfaff.org...
"JS" <dsa.@asdf.com> writes:
I read in K&R page 204 that sizeof use on a char returns 1. But when I write the following I get 4!

printf("%d\n",sizeof('g'));


In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).

Why do they write char if they mean int?
Nov 14 '05 #3

"JS" <dsa.@asdf.com> wrote in message news:d1**********@news.net.uni-c.dk...
I read in K&R page 204 that sizeof use on a char returns 1. But when I
write the following I get 4!

int main(void){
printf("%d\n",sizeof('g'));

}

Was it not supposed to return 1?


No, a character in single quotes, as in 'g', is a constant of type signed
int in C (I think this is different in C++).

Try:

int main(void) {
char c;
printf("%d\n", (int)sizeof c);
return 0;
}

Alex
Nov 14 '05 #4

JS <dsa.@asdf.com> scribbled the following:
"Ben Pfaff" <bl*@cs.stanford.edu> skrev i en meddelelse
news:87************@benpfaff.org...
"JS" <dsa.@asdf.com> writes:
> I read in K&R page 204 that sizeof use on a char returns 1. But when I write
> the following I get 4!
>
> printf("%d\n",sizeof('g'));
In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).

Why do they write char if they mean int?


They don't. sizeof(char) is 1, but sizeof('g') is sizeof(int), which is
4 on your platform. 'g' is not a char. It's an int. Like Ben Pfaff
said, character constants are ints, not chars.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-------------------------------------------------------- rules! --------/
"The day Microsoft makes something that doesn't suck is probably the day they
start making vacuum cleaners."
- Ernst Jan Plugge
Nov 14 '05 #5

JS wrote:
I read in K&R page 204 that sizeof use on a char returns 1. But when I write
the following I get 4!

int main(void){
printf("%d\n",sizeof('g'));

}

Was it not supposed to return 1?


You have stumbled upon a difference between C and C++.
In C, literal 'g' is an int, while in C++ literal 'g' is a char.
Here are the C and C++ forms of the code, with the output of each.

#include <stdio.h>
#define COMPILER "<your compiler here> (C)"

int main(void)
{
printf(COMPILER " %d\n", (int) sizeof 'g');
return 0;
}

#include <cstdio>
using namespace std;
#define COMPILER "<your compiler here> (C++)"

int main(void)
{
printf(COMPILER " %d\n", (int) sizeof 'g');
return 0;
}

[outputs]
gcc (C) 4
bcc (C) 4
gcc (C++) 1
bcc (C++) 1
Nov 14 '05 #6

"JS" <dsa.@asdf.com> writes:
"Ben Pfaff" <bl*@cs.stanford.edu> skrev i en meddelelse
news:87************@benpfaff.org...
"JS" <dsa.@asdf.com> writes:
> I read in K&R page 204 that sizeof use on a char returns 1. But when I write
> the following I get 4!
>
> printf("%d\n",sizeof('g'));


In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


Why do they write char if they mean int?


When do K&R do that?
--
Ben Pfaff
email: bl*@cs.stanford.edu
web: http://benpfaff.org
Nov 14 '05 #7

"JS" <dsa.@asdf.com> writes:
I read in K&R page 204 that sizeof use on a char returns 1. But when I write
the following I get 4!

int main(void){
printf("%d\n",sizeof('g'));

}

Was it not supposed to return 1?


Contrary to what you might expect, a character constant such as 'g'
is an int value. So sizeof('g') == sizeof(int).

If you do sizeof( (char)'g' ) you should get 1.

Nov 14 '05 #8

Alex Fraser wrote:
No, a character in single quotes, as in 'g', is a constant of type signed
int in C (I think this is different in C++).

Try:

int main(void) {
char c;
printf("%d\n", (int)sizeof c);
Don't you mean:
printf("%u\n", sizeof c);
return 0;
}

Alex

Nov 14 '05 #9

Ben Pfaff <bl*@cs.stanford.edu> spoke thus:
In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


(semi OT) Why did the authors of the standard make this arguably
non-intuitive stipulation?

--
Christopher Benson-Manica | I *should* know what I'm talking about - if I
ataru(at)cyberspace.org | don't, I need to know. Flames welcome.
Nov 14 '05 #10

Mark Odell <od*******@hotmail.com> writes:
Alex Fraser wrote:
printf("%d\n", (int)sizeof c);

Don't you mean:
printf("%u\n", sizeof c);


No, the result of sizeof has type size_t, which need not be
unsigned int.
--
int main(void){char p[]="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuv wxyz.\
\n",*q="kl BIcNBFr.NKEzjwCIxNJC";int i=sizeof p/2;char *strchr();int putchar(\
);while(*q){i+=strchr(p,*q++)-p;if(i>=(int)sizeof p)i-=sizeof p-1;putchar(p[i]\
);}return 0;}
Nov 14 '05 #11

In article <d1**********@chessie.cirr.com>, Christopher Benson-Manica wrote:
Ben Pfaff <bl*@cs.stanford.edu> spoke thus:
In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


(semi OT) Why did the authors of the standard make this arguably
non-intuitive stipulation?


Because you may be able to put more than one character into character
constants (but the actual number that fits into it is
implementation-defined and may be 1.)

putchar('foo');

This is only in ANSI/ISO C for compatibility with traditional C, and I
don't think anybody uses this feature anymore. The value of a multi-char
character constant is implementation-defined, so it is not portable at
all (gcc chooses to ignore all but the last character in such a
constant, so the outcome of 'foo' would be 'o'. In addition, gcc always
issues a diagnostic for multi-char character constants because it is so
unlikely that you really meant to use one.)

--
My real email address is ``nils<at>gnulinux<dot>nl''
Nov 14 '05 #12

Mark Odell wrote:
Alex Fraser wrote:
No, a character in single quotes, as in 'g', is a constant of type signed
int in C (I think this is different in C++).

Try:

int main(void) {
char c;
printf("%d\n", (int)sizeof c);

Don't you mean:
printf("%u\n", sizeof c);


No, since there is no reason to think that a size_t is an unsigned int.
The "correct" specifier for an uncast size_t might be "%lu" or "%llu".
Nov 14 '05 #13

Nils Weller <me@privacy.net> writes:
In article <d1**********@chessie.cirr.com>, Christopher Benson-Manica wrote:
Ben Pfaff <bl*@cs.stanford.edu> spoke thus:
In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


(semi OT) Why did the authors of the standard make this arguably
non-intuitive stipulation?


Because you may be able to put more than one character into character
constants (but the actual number that fits into it is
implementation-defined and may be 1.)


That is not a cause, that is an effect. If a character constant
had type `char', then by definition you could only fit one `char'
into it.
--
int main(void){char p[]="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuv wxyz.\
\n",*q="kl BIcNBFr.NKEzjwCIxNJC";int i=sizeof p/2;char *strchr();int putchar(\
);while(*q){i+=strchr(p,*q++)-p;if(i>=(int)sizeof p)i-=sizeof p-1;putchar(p[i]\
);}return 0;}
Nov 14 '05 #14

Hey, Ben, long time no see.

Your sig block program -- I can't compile it and I can't figure out
why.

I get:

D:\Steve\mcc>mingw32-gcc -c sigc0.c -o sigc0.o
sigc0.c: In function `main':
sigc0.c:1: stray '\255' in program

I broke it out to multiline format:

int main(void){
char p[]="ABCDEFGHIJKLMNOPQRSTUVWXY*Zabcdefghijklmnopqrstu vwxyz.
\n",*q="kl BIcNBFr.NKEzjwCIxNJC";

int i=sizeof p/2;

char *strchr();

int putchar();
while(*q) {
i+=strchr(p,*q++)-*p;
if(i>=(int)sizeof p)i-=sizeof p-1;
putchar(p[i]);
}
return 0;
}

And it now reports the problem on line 10:

D:\Steve\mcc>mingw32-gcc -c sigc.c -o sigc.o
sigc.c: In function `main':
sigc.c:10: stray '\255' in program

Commenting out line 10 does, in fact, permit it to compile.

I've dumped out the source file in hex, ascii and decimal, no 255
character in the source code.

What gives?

Nov 14 '05 #15

In article <87************@benpfaff.org>, Ben Pfaff wrote:
Nils Weller <me@privacy.net> writes:
In article <d1**********@chessie.cirr.com>, Christopher Benson-Manica wrote:
Ben Pfaff <bl*@cs.stanford.edu> spoke thus:

In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).

(semi OT) Why did the authors of the standard make this arguably
non-intuitive stipulation?


Because you may be able to put more than one character into character
constants (but the actual number that fits into it is implementation-
defined any may be 1.)


That is not a cause, that is an effect. If a character constant
had type `char', then by definition you could only fit one `char'
into it.


I thought this is a newsgroup about the C programming language, and not
about linguistics. The intent of what I said is obvious to anyone who
does not intentionally misinterpret it. But okay, I'll rephrase:

Because character constants were designed to be capable of carrying
more than one character (and it follows, obviously, that a ``char'' is
not enough to carry more than one character - hence the need for a
larger data type.)

--
My real email address is ``nils<at>gnulinux<dot>nl''
Nov 14 '05 #16

"Steven K. Mariner" <ma*******@earthlink.net> writes:
Your sig block program -- I can't compile it and I can't figure out
why.

D:\Steve\mcc>mingw32-gcc -c sigc0.c -o sigc0.o
sigc0.c: In function `main':
sigc0.c:1: stray '\255' in program


I think your compiler must be buggy.
--
"I ran it on my DeathStation 9000 and demons flew out of my nose." --Kaz
Nov 14 '05 #17

Nils Weller <me@privacy.net> writes:
Because character constants were designed to be capable of carrying
more than one character (and it follows, obviously, that a ``char'' is
not enough to carry more than one character - hence the need for a
larger data type.)


Okay--so, then, *why* were character constants designed to be
capable of carrying more than one character?
--
"...deficient support can be a virtue.
It keeps the amateurs off."
--Bjarne Stroustrup
Nov 14 '05 #18

Steven K. Mariner <ma*******@earthlink.net> wrote:
Hey, Ben, long time no see. Your sig block program -- I can't compile it
and I can't figure out why. I get:

D:\Steve\mcc>mingw32-gcc -c sigc0.c -o sigc0.o
sigc0.c: In function `main':
sigc0.c:1: stray '\255' in program

I broke it out to multiline format:

int main(void){
char p[]="ABCDEFGHIJKLMNOPQRSTUVWXY*Zabcdefghijklmnopqrstu vwxyz.
\n",*q="kl BIcNBFr.NKEzjwCIxNJC";
int i=sizeof p/2;
char *strchr();
int putchar();
while(*q) {
i+=strchr(p,*q++)-*p;
if(i>=(int)sizeof p)i-=sizeof p-1;
putchar(p[i]);
}
return 0;
}
You have managed to get some strange character into it. On the line

i+=strchr(p,*q++)-*p;

the character directly in front of the 'p' at the end seems to be the
culprit - it wasn't in Bens signature and is some non-ASCII character
(it arrives here as 0xAD, looking like a '-' but that might be an
artefact of the newsreader). But also the line
char p[]="ABCDEFGHIJKLMNOPQRSTUVWXY*Zabcdefghijklmnopqrstu vwxyz.
\n",*q="kl BIcNBFr.NKEzjwCIxNJC";


is dangerous (the space before the '\n' might get lost and you
would need a '\' at the end of the first line), better make that

char p[]="ABCDEFGHIJKLMNOPQRSTUVWXY*Zabcdefghijklmnopqrstu vwxyz. \n",
*q="kl BIcNBFr.NKEzjwCIxNJC";

Regards, Jens
--
\ Jens Thoms Toerring ___ Je***********@physik.fu-berlin.de
\__________________________ http://www.toerring.de
Nov 14 '05 #19

In article <87************@benpfaff.org>, Ben Pfaff wrote:
Nils Weller <me@privacy.net> writes:
Because character constants were designed to be capable of carrying
more than one character (and it follows, obviously, that a ``char'' is
not enough to carry more than one character - hence the need for a
larger data type.)


Okay--so, then, *why* were character constants designed to be
capable of carrying more than one character?


That I don't know. I'd guess that it was intended to automize bit
shifting for you. The value of a multi-char character constant is now
implementation-defined, but historically you really got all chars you
asked for:

'foo'

would turn into

'f' << (CHAR_BIT * 2) | 'o' << CHAR_BIT | 'o'

This may seem more plausible for values written in hexadecimal notation:

'\xff\x12'

would turn into

0xff << CHAR_BIT | 0x12

--
My real email address is ``nils<at>gnulinux<dot>nl''
Nov 14 '05 #20

In article <3a*************@individual.net>, Nils Weller wrote:
In article <87************@benpfaff.org>, Ben Pfaff wrote:
Nils Weller <me@privacy.net> writes:
Because character constants were designed to be capable of carrying
more than one character (and it follows, obviously, that a ``char'' is
not enough to carry more than one character - hence the need for a
larger data type.)


Okay--so, then, *why* were character constants designed to be
capable of carrying more than one character?


That I don't know. I'd guess that it was intended to automize bit
shifting for you. The value of a multi-char character constant is now
implementation-defined, but historically you really got all chars you
asked for:

'foo'

would turn into

'f' << (CHAR_BIT * 2) | 'o' << CHAR_BIT | 'o'

This may seem more plausible for values written in hexadecimal notation:

'\xff\x12'

would turn into

0xff << CHAR_BIT | 0x12


Actually it makes a lot more sense for values written in octal notation
since 0xff << CHAR_BIT | 0x12 can be represented as a simple 0xff12 as
well.
--
My real email address is ``nils<at>gnulinux<dot>nl''
Nov 14 '05 #21

JS wrote on 23/03/05 :
I read in K&R page 204 that sizeof use on a char returns 1. But when I write
Correct.
the following I get 4!

int main(void){
printf("%d\n",sizeof('g'));

}

Was it not supposed to return 1?


No, because a character literal is not a char. It fits into a char. Big
difference.

Actually, a character literal is an int.

--
Emmanuel
The C-FAQ: http://www.eskimo.com/~scs/C-faq/faq.html
The C-library: http://www.dinkumware.com/refxc.html

"Mal nommer les choses c'est ajouter du malheur au
monde." -- Albert Camus.

Nov 14 '05 #22

Mark Odell wrote on 23/03/05 :
char c;
printf("%d\n", (int)sizeof c);


Don't you mean:
printf("%u\n", sizeof c);
return 0;
}


Assuming <stdio.h> was included, the first version was correct. Yours
is not, because a size_t could be an unsigned long. You want :

printf("%u\n", (unsigned) sizeof c);

or (C99)

printf("%zu\n", sizeof c);

--
Emmanuel
The C-FAQ: http://www.eskimo.com/~scs/C-faq/faq.html
The C-library: http://www.dinkumware.com/refxc.html

"C is a sharp tool"

Nov 14 '05 #23

Steven K. Mariner wrote on 23/03/05 :
char p[]="ABCDEFGHIJKLMNOPQRSTUVWXY*Zabcdefghijklmnopqrstu vwxyz.
\n",*q="kl BIcNBFr.NKEzjwCIxNJC";

Put that on a single line, and cut at the comma...

char p[]="ABCDEFGHIJKLMNOPQRSTUVWXY*Zabcdefghijklmnopqrstu vwxyz.\n"
,*q="kl BIcNBFr.NKEzjwCIxNJC";

--
Emmanuel
The C-FAQ: http://www.eskimo.com/~scs/C-faq/faq.html
The C-library: http://www.dinkumware.com/refxc.html

"C is a sharp tool"

Nov 14 '05 #24

Martin Ambuhl wrote:
Mark Odell wrote:
Alex Fraser wrote:
No, a character in single quotes, as in 'g', is a constant of type
signed
int in C (I think this is different in C++).

Try:

int main(void) {
char c;
printf("%d\n", (int)sizeof c);


Don't you mean:
printf("%u\n", sizeof c);

No, since there is no reason to think that a size_t is an unsigned int.
The "correct" specifier for an uncast size_t might be "%lu" or "%llu".


Addendum:
In C99, we have "%zu" specifically for size_t.
No more guesswork :-)
Cheers
Michael
--
E-Mail: Mine is an /at/ gmx /dot/ de address.
Nov 14 '05 #25

Nils Weller <me@privacy.net> writes:
In article <d1**********@chessie.cirr.com>, Christopher Benson-Manica wrote:
Ben Pfaff <bl*@cs.stanford.edu> spoke thus:
In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


(semi OT) Why did the authors of the standard make this arguably
non-intuitive stipulation?


Because you may be able to put more than one character into character
constants (but the actual number that fits into it is
implementation-defined and may be 1.)

putchar('foo');

[...]

I don't think that's the reason.

In C, especially in older versions such as K&R C, there are few if any
expressions of integer types shorter than int. Evaluating an object
of type char or short almost always causes an implicit conversion to
int. (I might be wrong about the "almost".) Making character
constants have type char would actually make things more complicated.

For example:

char c;
c = 'g';

If 'g' were of type char, the value would be implicitly converted to
type int before being converted back to type char and assigned to c.
With 'g' being of type int, there's only one implicit conversion, not
two. (Of course the compiler is likely to optimize the whole thing to
a single "store byte" instruction, if such an instruction exists.)

The only case I can think of where this visibly matters is applying
the sizeof operator to a character constant, something that is rarely
useful.

<OT>
In C++, character constants are of type char; this is because of some
considerations involving overloading and/or templates, which don't
apply in C.
</OT>

Incidentally, the OP's question could have been answered by citing
question 8.9 in the C FAQ.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #26

<posted & mailed>

sizeof yields the storage size of its operand. Since 'g' is a character
constant of type int, you get sizeof(int). On your platform... that's 4.

JS wrote:
I read in K&R page 204 that sizeof use on a char returns 1. But when I
write the following I get 4!

int main(void){
printf("%d\n",sizeof('g'));

}

Was it not supposed to return 1?


--
Remove '.nospam' from e-mail address to reply by e-mail
Nov 14 '05 #27

On Wed, 23 Mar 2005 18:08:25 +0000 (UTC), Christopher Benson-Manica
<at***@nospam.cyberspace.org> wrote in comp.lang.c:
Ben Pfaff <bl*@cs.stanford.edu> spoke thus:
In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


(semi OT) Why did the authors of the standard make this arguably
non-intuitive stipulation?


Let's just assume ASCII for a moment, to simplify:

char c1 = 65;
char c2 = 'A';
What is the type of the initializer for c1? What is the type of the
initializer for c2? Why should they not be the same type? Either can
fit in a char.

...or even more succinctly:

char c1 = 0;
char c2 = '\0';

Note that all constant expressions of integral type are of type int or
higher rank. The rules for character literals are no different than
those for other integer literals. Unsuffixed integer literals whose
values are representable in an int are of type int, not of the smallest type
that can hold the value.

Consider that the only differences between 'A', 65, 0x41, and 081
exist syntactically in the way they are parsed at run time. They are
all one and exactly the same thing at run time, integer literals of
type int.

This is all very consistent with C's basic philosophy that there is
nothing special about characters, they are just integer types holding
numbers. As opposed to languages where one has to use special
functions (CHR$ and ASC come to mind from old interpreted BASIC
dialects) to handle the conversion between characters and their
numeric values.

--
Jack Klein
Home: http://JK-Technology.Com
FAQs for
comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
comp.lang.c++ http://www.parashift.com/c++-faq-lite/
alt.comp.lang.learn.c-c++
http://www.contrib.andrew.cmu.edu/~a...FAQ-acllc.html
Nov 14 '05 #28

Jack Klein wrote:
On Wed, 23 Mar 2005 18:08:25 +0000 (UTC), Christopher Benson-Manica
<at***@nospam.cyberspace.org> wrote in comp.lang.c:

Ben Pfaff <bl*@cs.stanford.edu> spoke thus:

In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).
(semi OT) Why did the authors of the standard make this arguably
non-intuitive stipulation?

Let's just assume ASCII for a moment, to simplify:

char c1 = 65;
char c2 = 'A';
What is the type of the initializer for c1? What is the type of the
initializer for c2? Why should they not be the same type? Either can
fit in a char.

...or even more succinctly:

char c1 = 0;
char c2 = '\0';

Note that all constant expressions of integral type are of type int or
higher rank. The rules for character literals are no different than
those for other integer literals. Unsuffixed integer literals whose
values are representable in an int are of type int, not of the smallest type
that can hold the value.

Consider that the only differences between 'A', 65, 0x41, and 081
exist syntactically in the way they are parsed at run time. They are


ITYM: compile time (here, as opposed to run time below).
all one and exactly the same thing at run time, integer literals of
type int.

This is all very consistent with C's basic philosophy that there is
nothing special about characters, they are just integer types holding
numbers. As opposed to languages where one has to use special
functions (CHR$ and ASC come to mind from old interpreted BASIC
dialects) to handle the conversion between characters and their
numeric values.

--
E-Mail: Mine is an /at/ gmx /dot/ de address.
Nov 14 '05 #29

Jack Klein <ja*******@spamcop.net> spoke thus:
...or even more succinctly:

char c1 = 0;
char c2 = '\0';


That was, indeed, very succinct. Thanks for a highly enlightening
article :)

--
Christopher Benson-Manica | I *should* know what I'm talking about - if I
ataru(at)cyberspace.org | don't, I need to know. Flames welcome.
Nov 14 '05 #30



JS wrote:
"Ben Pfaff" <bl*@cs.stanford.edu> skrev i en meddelelse
news:87************@benpfaff.org...
"JS" <dsa.@asdf.com> writes:
I read in K&R page 204 that sizeof use on a char returns 1. But when I write the following I get 4!

printf("%d\n",sizeof('g'));


In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


Why do they write char if they mean int?


Look up integer promotion.
Nov 14 '05 #31

Perhaps.

Manual examination noted that a '-' got translated to '--' somehow
during my transferring of it to a text file. When I switched from MinGW
to an old bash shell version of gcc, I got a more intelligent error
about trying to modify an lvalue. That plus a comment below by Jens
helped me find the problem, and when I fixed it so my text file more
closely matched your source, it compiled just fine.

I thought it would simply be a cute display message, but I was dying to
see the message without having to completely reverse-engineer it.

And I was right. It's a cute little display from a fairly basic cipher.
Nice.

Nov 14 '05 #32

Strange character was inserted (as best I can deduce) by cut-n-paste
processes. In the final analysis, a single minus sign translated
somehow to a double (probably some kind of auto-adjust courtesy of the
Microsoft products I'm using in this environment [*gag*]).

Corrected it by hand and the program now compiles and functions
correctly.

Thanks for the note.

Nov 14 '05 #33

Neil Kurzman wrote:
JS wrote:
"Ben Pfaff" <bl*@cs.stanford.edu> skrev:
"JS" <dsa.@asdf.com> writes:

I read in K&R page 204 that sizeof use on a char returns 1.
But when I write the following I get 4!

printf("%d\n",sizeof('g'));

In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).


Why do they write char if they mean int?


lookup Integer Promotion


This has nothing to do with integer promotion. The argument
to 'sizeof' does not undergo any promotions or conversions.

Nov 14 '05 #34

Jack Klein wrote:

Let's just assume ASCII for a moment.

Consider that the only differences between 'A', 65, 0x41,
and 081 exist syntactically in the way they are parsed


081 ?

Nov 14 '05 #35

"Old Wolf" <ol*****@inspire.net.nz> writes:
Neil Kurzman wrote:
JS wrote:
"Ben Pfaff" <bl*@cs.stanford.edu> skrev:
"JS" <dsa.@asdf.com> writes:

> I read in K&R page 204 that sizeof use on a char returns 1.
> But when I write the following I get 4!
>
> printf("%d\n",sizeof('g'));

In C, character constants have type `int', so sizeof 'g' is equal
to sizeof(int).

Why do they write char if they mean int?


lookup Integer Promotion


This has nothing to do with integer promotion. The argument
to 'sizeof' does not undergo any promotions or conversions.


It's relevant, but only very indirectly. Since expressions of type
char are almost always promoted to int, there would be little point
(in C) in making character constants be of type char rather than of
type int.

This is part of the rationale for why the standard is the way it is;
it's not something from which you can logically infer that character
constants are of type int. The standard could just as easily have
said that character constants are of type char; they would then be
immediately promoted to int, so it would make a difference only when
the character constant is the operand of sizeof.

(Note that if plain char were unsigned, and sizeof(int)==1 (which
requires CHAR_BIT>=16), character constants of type char would promote
to unsigned int rather than int, but that doesn't apply to most
implementations.)

The real answer to "Why are character constants of type int rather
than char?" is "Because the standard says so". *Why* the standard
says so is another question.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #36

Old Wolf <ol*****@inspire.net.nz> scribbled the following:
Jack Klein wrote:
Let's just assume ASCII for a moment.

Consider that the only differences between 'A', 65, 0x41,
and 081 exist syntactically in the way they are parsed
081 ?


Shouldn't that be 0101?

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-------------------------------------------------------- rules! --------/
"Bad things only happen to scoundrels."
- Moominmamma
Nov 14 '05 #37
