
sign of char literals in #if directive

How should a C++ preprocessor interpret char literals in #if directives?

From the MinGW limits.h (the comment is in the original):

===code snippet begin===

#define SCHAR_MIN (-128)
#define SCHAR_MAX 127

#define UCHAR_MAX 255

/* TODO: Is this safe? I think it might just be testing the preprocessor,
* not the compiler itself... */
#if ('\x80' < 0)
#define CHAR_MIN SCHAR_MIN
#define CHAR_MAX SCHAR_MAX
#else
#define CHAR_MIN 0
#define CHAR_MAX UCHAR_MAX
#endif

===code snippet end===
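
To make the worry in that comment concrete: if the preprocessor takes the
#else branch while the compiler's plain char is actually signed, CHAR_MIN and
CHAR_MAX no longer describe the real type. A compile-time check along these
lines (my own addition, not from limits.h; it would sit after the #if block
above) would then fail to compile:

===code snippet begin===

/* Consistency check (mine, for illustration only): the array size is -1,
 * and compilation fails, if the CHAR_MIN picked by the preprocessor
 * disagrees with the signedness the compiler actually gives plain char. */
typedef char char_signedness_check[((char)-1 < 0) == (CHAR_MIN < 0) ? 1 : -1];

===code snippet end===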

From the final draft of the standard, I gather that the signedness of char
is implementation-defined and that the signedness of char seen by the
compiler isn't required to match that of the preprocessor.

===Final draft, 16.1/4===
Whether the numeric value for these character literals matches the value
obtained when an identical character literal occurs in an expression (other
than within a #if or #elif directive) is implementation-defined.
===draft snippet end===

So the standard (in the draft version, I assume the final is the same)
doesn't even give a hint: should the preprocessor treat '\x80' as -128
or 128?
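
For what it's worth, a quick way to see what a given implementation actually
does is to compare the two contexts side by side (my own test, not from the
standard):

===code snippet begin===

#include <cstdio>

int main()
{
    /* What the compiler proper does with '\x80' in an ordinary expression. */
    std::printf("compiler:     '\\x80' %s 0\n", ('\x80' < 0) ? "<" : ">=");

    /* What the preprocessor does with '\x80' in a #if controlling expression. */
#if '\x80' < 0
    std::printf("preprocessor: '\\x80' <  0\n");
#else
    std::printf("preprocessor: '\\x80' >= 0\n");
#endif
    return 0;
}

===code snippet end===

The draft wording above allows the two lines to disagree, which is exactly
what the mingw comment is worried about.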

I ask because I'm working on a C++ parser to be used for code
analysis and transformation tools, and it needs to have its own
preprocessor.

Should I settle on signed? Unsigned? Is there a de facto standard?
Or should I let a pragma decide, such as:

#pragma kvernbitr preprocessor_char_signedness signed
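
Internally, the pragma would just flip a flag that the #if evaluator consults
when it converts a character literal to a value, roughly like this (a sketch
only; the names are placeholders, not code from my parser):

===code snippet begin===

/* Sketch only; names are placeholders. */
enum CharSignedness { kSignedChars, kUnsignedChars };

/* Value a narrow character literal contributes to #if arithmetic, given
 * the raw 8-bit code of the character and the configured signedness. */
long pp_char_value(unsigned char raw, CharSignedness mode)
{
    if (mode == kSignedChars)
        return static_cast<signed char>(raw); /* '\x80' -> -128 */
    return raw;                               /* '\x80' ->  128 */
}

===code snippet end===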
Aug 30 '07 #1
