ma**********@thales-is.com writes:
I am porting some old code from Digital Unix to Linux using token
pasting, which is failing to compile (code simplified):
#define DEBUG(strg,v) printf("debugoutput: "##strg##".\n", v);
main()
{
    int x = 5;
    DEBUG("variable x is %d", x);
}
Compiler gives the following output:
b.c:6:1: pasting ""debugoutput: "" and ""variable x is %d"" does not
give a valid preprocessing token
b.c:6:1: pasting ""variable x is %d"" and "".\n"" does not give a
valid preprocessing token
I am trying to understand the concept of a "valid preprocessing
token", but I would also like to know how I can achieve what the code
tries to do.
Token-pasting joins two tokens together; the result must be a single
valid token. If you join the tokens
"foo"
and
"bar"
you get
"foo""bar"
which is not valid as a single token (though it is valid as two
tokens, two consecutive string literals). Apparently the old compiler
on Digital Unix didn't enforce the modern rules.
But you don't *need* token-pasting here, since adjacent string
literals are merged anyway. (That's been true since the 1989 ANSI C
standard, which also introduced token-pasting, so it's likely to work
under Digital Unix as well.)
--
Keith Thompson (The_Other_Keith)
ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"