Bytes IT Community

(Opinion) Overuse of symbolic constants

Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).
Any opinions?

--SS
Jul 22 '05 #1
35 Replies


> Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).


I consider three important uses for defines or constants:

1. For values that are subject to change.
2. To distinguish a certain usage of a value from other usages, so that
searches are more efficient.
3. More meaningful names (e.g. PATH_NOT_FOUND instead of a value).

I do not think that the examples shown above fit into any of these usages.

Niels Dybdahl
Jul 22 '05 #2



Sandeep Sharma wrote:

Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).

Any opinions?

--SS


I use the SLASH definition to distinguish the directory name separator
for Unix vs. Windows:

#ifdef WIN32
#define SLASH '\\'
#else
#define SLASH '/'
#endif
--
Fred L. Kleinschmidt
Boeing Associate Technical Fellow
Technical Architect, Common User Interface Services
M/S 2R-94 (206)544-5225
Jul 22 '05 #3

On Wed, 21 Apr 2004, Sandeep Sharma wrote:
Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).

Any opinions?


I use macros for data that is subject to change, or to make the code more
readable. The characters themselves are just as readable as the names, so
I see no point in two of the three examples.

However, the SLASH might have a place. I have seen code like:

#ifdef WINDOWS
#define SEPERATOR '\\'
#else
#define SEPERATOR '/'
#endif

Have you asked the distinguished colleague why the need for the macros? If
they have a good reason maybe a comment in the source code would be
helpful.

I'm guessing they are just blindly following something they were taught
without understanding why.

--
Send e-mail to: darrell at cs dot toronto dot edu
Don't send e-mail to vi************@whitehouse.gov
Jul 22 '05 #4

"Fred L. Kleinschmidt" <fred.l.kleinschmidt@nospam_boeing.com> writes:
Sandeep Sharma wrote:

Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).

Any opinions?


I use the SLASH definition to distinguish the directory name separator
for Unix vs. Windows:

#ifdef WIN32
#define SLASH '\\'
#else
#define SLASH '/'
#endif


I find this misleading. I'd prefer

#ifdef WIN32
#define DIR_SEP '\\'
#else
#define DIR_SEP '/'
#endif

Martin
--
,--. Martin Dickopp, Dresden, Germany ,= ,-_-. =.
/ ,- ) http://www.zero-based.org/ ((_/)o o(\_))
\ `-' `-'(. .)`-'
`-. Debian, a variant of the GNU operating system. \_/
Jul 22 '05 #5

In <Pi*******************************@drj.pf> da*****@NOMORESPAMcs.utoronto.ca.com (Darrell Grainger) writes:
However, the SLASH might have a place. I have seen code like:

#ifdef WINDOWS
#define SEPERATOR '\\'
#else
#define SEPERATOR '/'
#endif


It was probably written by someone ignoring both English and Windows.

In most contexts, Windows accepts the forward slash as path separator.
The only exception coming to mind is COMMAND.COM (newer command
interpreters are perfectly happy with Unix-style path specifications).

So, in a C context, it's only strings prepared to be passed to system()
that need the distinction. But such strings are typically affected by
much more important portability issues...

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Jul 22 '05 #6

sa*********@yahoo.com (Sandeep Sharma) wrote in message news:<4c*************************@posting.google.c om>...
Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).


I agree. To be useful, a symbolic name needs to be symbolic -- it
needs to symbolize something. These are roughly equivalent to
comments like:

a=b; /* assign b to a */

that only repeat what's already obvious. To be useful, the symbol
needs to add meaning that isn't obvious without it. A contrast would
be names like the following:

#define switch_char '-'
#define path_sep '/'

which really add meaning, as well as the flexibility of (for example)
allowing a path separator to be changed from '/' to '\\' to ':' as
appropriate.
Later,
Jerry.

--
The universe is a figment of its own imagination.
Jul 22 '05 #7


On Wed, 21 Apr 2004, Sandeep Sharma wrote in comp.lang.c:

Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1


One possibility I haven't seen anyone mention yet is that while the
abstract *value* of "DASH" will never change, the *type* of "DASH"
may well change: for example, when updating this code to deal with
wide character I/O we write

#define DASH L'-'
#define SLASH L'/'
#define SINGLE_BYTE (sizeof (wchar_t))

(Here SINGLE_BYTE is a misnomer; better to name it SINGLE_CHAR or
something similar.)

A second possibility is that the first couple entries were taken
out of context from a hand-written lexer or parser:

#define DASH '-'
#define SLASH '/'
#define ASSGNOP 1000
#define STAR_ASSGNOP 1001
#define DASH_ASSGNOP 1002
#define PLUSPLUS 1003
[...]

Here we are simplifying the lexer code by making the "dash" token
equal in value to the system's '-' character, and similarly for all
other one-character tokens: multi-character tokens get their own
numerical "token" values, out of the ASCII range.
(This example code is obviously not quite portable; it's implicitly
assuming that '-' and 1000 are distinct values, which is guaranteed
by ASCII but not by Standard C.)

The already-mentioned "platform-independent directory separator"
idea is a common one, too, but I don't see how it explains the presence
of DASH along with SLASH.

HTH,
-Arthur
Jul 22 '05 #8

"Niels Dybdahl" <nd*@fjern.detteesko-graphics.com> wrote:
Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).


I consider three important uses for defines or constants:

1. For values that are subject to change.
2. To distinguish a certain usage of a value from other usages, so that
searches are more efficient.
3. More meaningful names (e.g. PATH_NOT_FOUND instead of a value).

I do not think that the examples shown above fit into any of these usages.


From a C99 implementation near you:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=

This is interesting because a co-developer of mine has a standard include
file (predating C99 by a long way) which has:

#define AND &&
#define OR ||
#define NOT !

So I guess there is a fourth category, which the OP's definitions might
fall into: improving readability.
Jul 22 '05 #9

Old Wolf wrote:
.... snip ...
From a C99 implementation near you:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=


This has been in C90 since about 1995. #include <iso646.h>

--
A: Because it fouls the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
Jul 22 '05 #10

In <40***************@yahoo.com> CBFalconer <cb********@yahoo.com> writes:
Old Wolf wrote:

... snip ...

From a C99 implementation near you:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=


This has been in C90 since about 1995. #include <iso646.h>


More correctly, this has been in C94 since about 1995. I don't know
if C94 implementations were popular enough to be worth messing with
<iso646.h> if you cared about portable programming.

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Jul 22 '05 #11

"Fred L. Kleinschmidt" wrote:
[...]
I use the SLASH definition to distinguish the directory name separator
for Unix vs. Windows:

#ifdef WIN32
#define SLASH '\\'
#else
#define SLASH '/'
#endif


Why bother? Unless you are building a filename to pass to system(),
and that specific command doesn't like '/', the above is superfluous.
(That, and misleading by using the name SLASH rather than something
like PATH_SEP.)

All versions of MS-Windows, and all versions of MS-DOS back to 2.0
when they first allowed subdirectories can use the following:

FILE *f = fopen("/foo/bar/foobar.txt","r");

And even system() works if you're not executing a brain-dead command.
For example:

system("vi /foo/bar/foobar.txt");

It's amazing how many code samples I see that have code to build path
names differently under DOS/Windows, when these are used strictly for
internal purposes and not for system commands. And it's amazing how
many people are shocked when asked "since when?" and I tell them
"since always".

--
+-------------------------+--------------------+-----------------------------+
| Kenneth J. Brody | www.hvcomputer.com | |
| kenbrody at spamcop.net | www.fptech.com | #include <std_disclaimer.h> |
+-------------------------+--------------------+-----------------------------+
Jul 22 '05 #12

"Old Wolf" <ol*****@inspire.net.nz> wrote in message
news:84**************************@posting.google.c om...
"Niels Dybdahl" <nd*@fjern.detteesko-graphics.com> wrote:
Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).
I consider three important uses for defines or constants:

1. For values that are subject to change.
2. To distinguish a certain usage of a value from other usages, so that
searches are more efficient.
3. More meaningful names (e.g. PATH_NOT_FOUND instead of a value).

I do not think that the examples shown above fit into any of these
usages.
From a C99 implementation near you:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=

This is interesting because a co-developer of mine has a standard include
file (predating C99 by a long way) which has:

#define AND &&
#define OR ||
#define NOT !

So I guess there is a fourth category, which the OP's definitions might
fall into: improving readability.


The point was not primarily readability, but rather to enable people to
use those operators if their keyboard either did not support some/all of
those characters, or if it was inconveniently difficult to type them.

regards
--
jb

(replace y with x if you want to reply by e-mail)
Jul 22 '05 #13

Kenneth Brody <ke******@spamcop.net> writes:
"Fred L. Kleinschmidt" wrote:
[...]
I use the SLASH definition to distinguish the directory name separator
for Unix vs. Windows:

#ifdef WIN32
#define SLASH '\\'
#else
#define SLASH '/'
#endif


Why bother? Unless you are building a filename to pass to system(),
and that specific command doesn't like '/', the above is superfluous.
(That, and misleading by using the name SLASH rather than something
like PATH_SEP.)

All versions of MS-Windows, and all versions of MS-DOS back to 2.0
when they first allowed subdirectories can use the following:

FILE *f = fopen("/foo/bar/foobar.txt","r");

And even system() works if you're not executing a brain-dead command.
For example:

system("vi /foo/bar/foobar.txt");


Disclaimer: I don't write programs for Windows (except under Cygwin,
but that doesn't count in this context.)

If a file name is going to be displayed to a Windows end user, the
user is likely to be confused if the path name contains '/' characters
rather than '\' characters. Confusing users is seldom a good idea.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
Schroedinger does Shakespeare: "To be *and* not to be"
Jul 22 '05 #14

Jakob Bieling wrote:
"Old Wolf" <ol*****@inspire.net.nz> wrote in message
news:84**************************@posting.google.c om...
"Niels Dybdahl" <nd*@fjern.detteesko-graphics.com> wrote:
Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).

I consider three important uses for defines or constants:

1. For values that are subject to change.
2. To distinguish a certain usage of a value from other usages, so that
searches are more efficient.
3. More meaningful names (e.g. PATH_NOT_FOUND instead of a value).

I do not think that the examples shown above fit into any of these
usages.
From a C99 implementation near you:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=

This is interesting because a co-developer of mine has a standard include
file (predating C99 by a long way) which has:

#define AND &&
#define OR ||
#define NOT !

So I guess there is a fourth category, which the OP's definitions might
fall into: improving readability.

The point was not primarily readability, but rather to enable people to
use those operators if their keyboard either did not support some/all of
those characters, or if it was inconveniently difficult to type them.

I suppose not. This business of abusing the preprocessor is
typically done by newbies at play. Have you ever seen any code by a
professional programmer (someone who gets paid for it) using this
kind of stuff? I have not.

--
Joe Wright mailto:jo********@comcast.net
"Everything should be made as simple as possible, but not simpler."
--- Albert Einstein ---
Jul 22 '05 #15

Joe Wright wrote:
Jakob Bieling wrote:
"Old Wolf" <ol*****@inspire.net.nz> wrote in message
news:84**************************@posting.google.c om...
"Niels Dybdahl" <nd*@fjern.detteesko-graphics.com> wrote:

> Consider these examples (from the code written by a
> distinguished colleague):
>
> #define DASH '-'
> #define SLASH '/'
> #define SINGLE_BYTE 1
>

[SNIP]
From a C99 implementation near you:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=

[SNIP]

The point was not primarily readability, but rather to enable people to
use those operators if their keyboard either did not support some/all of
those characters, or if it was inconveniently difficult to type them.

I suppose not. This business of abusing the preprocessor is typically
done by newbies at play. Have you ever seen any code by a professional
programmer (someone who gets paid for it) using this kind of stuff? I
have not.


As a matter of fact, the first time I saw code for what was
to become the Bourne Shell (circa 1982), it was written exactly like
that except all the #defines were in CAPS.

--
"It is impossible to make anything foolproof
because fools are so ingenious"
- A. Bloch
Jul 22 '05 #16

"Kenneth Brody" <ke******@spamcop.net> wrote in message
news:40***************@spamcop.net...
All versions of MS-Windows, and all versions of MS-DOS back to 2.0
when they first allowed subdirectories can use the following:

FILE *f = fopen("/foo/bar/foobar.txt","r");

And even system() works if you're not executing a brain-dead command.
For example:

system("vi /foo/bar/foobar.txt");

It's amazing how many code samples I see that have code to build path
names differently under DOS/Windows, when these are used strictly for
internal purposes and not for system commands. And it's amazing how
many people are shocked when asked "since when?" and I tell them
"since always".


This was true even for the DOS CLI prior to 5.0 when MS appropriated the
forward slash for command options instead of following unix's dash
convention. While a forward slash may work in most API calls and many apps
today, that doesn't mean MS will continue to support it tomorrow, so using a
backslash is "safer" even if it's not quite as readable ('\\' vs '/') in C
source.

S

--
Stephen Sprunk "Stupid people surround themselves with smart
CCIE #3723 people. Smart people surround themselves with
K5SSS smart people who disagree with them." --Aaron Sorkin

Jul 22 '05 #17

Nick Landsberg wrote:
Joe Wright wrote:
Jakob Bieling wrote:
"Old Wolf" <ol*****@inspire.net.nz> wrote in message
news:84**************************@posting.google.c om...

"Niels Dybdahl" <nd*@fjern.detteesko-graphics.com> wrote:

>> Consider these examples (from the code written by a
>> distinguished colleague):
>>
>> #define DASH '-'
>> #define SLASH '/'
>> #define SINGLE_BYTE 1
>>
[SNIP]

From a C99 implementation near you:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=

[SNIP]

The point was not primarily readabilty, but rather to enable
people to
use those operators if their keyboard either did not support some/all of
those characters, or if it was unconviniently difficult to type them.

I suppose not. This business of abusing the preprocessor is typically
done by newbies at play. Have you ever seen any code by a professional
programmer (someone who gets paid for it) using this kind of stuff? I
have not.


As a matter of fact, the first time I saw code for what was
to become the Bourne Shell (circa 1982), it was written exactly like
that except all the #defines were in CAPS.

Ok Nick, I'll assume that was a throwaway. I'm so old I've probably
seen it too and just forgot. Let me ask it another way..

Assuming you are a professional programmer, would you use these
#defines? If you were a teacher, would you recommend them to your
students?

if (A and B or C and D) {} doesn't look like C to me.
--
Joe Wright mailto:jo********@comcast.net
"Everything should be made as simple as possible, but not simpler."
--- Albert Einstein ---
Jul 22 '05 #18

Joe Wright wrote:
Nick Landsberg wrote:
Joe Wright wrote:
Jakob Bieling wrote:

"Old Wolf" <ol*****@inspire.net.nz> wrote in message
news:84**************************@posting.google.c om...

> "Niels Dybdahl" <nd*@fjern.detteesko-graphics.com> wrote:
>
>>> Consider these examples (from the code written by a
>>> distinguished colleague):
>>>
>>> #define DASH '-'
>>> #define SLASH '/'
>>> #define SINGLE_BYTE 1
>>>


[SNIP]

> From a C99 implementation near you:
>
> #define and &&
> #define and_eq &=
> #define bitand &
> #define bitor |
> #define compl ~
> #define not !
> #define not_eq !=
> #define or ||
> #define or_eq |=
> #define xor ^
> #define xor_eq ^=
>


[SNIP]

The point was not primarily readability, but rather to enable people to
use those operators if their keyboard either did not support some/all of
those characters, or if it was inconveniently difficult to type them.

I suppose not. This business of abusing the preprocessor is typically
done by newbies at play. Have you ever seen any code by a
professional programmer (someone who gets paid for it) using this
kind of stuff? I have not.


As a matter of fact, the first time I saw code for what was
to become the Bourne Shell (circa 1982), it was written exactly like
that except all the #defines were in CAPS.

Ok Nick, I'll assume that was a throwaway. I'm so old I've probably seen
it too and just forgot. Let me ask it another way..

Assuming you are a professional programmer, would you use these
#defines? If you were a teacher, would you recommend them to your students?

if (A and B or C and D) {} doesn't look like C to me.


You're right, Joe... it don't look like C and it don't
smell like C and it don't taste like C. (Pardons
for the bad grammar, but it was to make a point.)

The example *was* about preprocessor abuse and by
a respected developer within the company. Unfortunately,
no one could easily debug the code since we could not
read it as "C". (or easily add functionality to it
since we weren't sure of just about anything at that
point.)

Personally I wouldn't use them.

If I *were* a teacher of C, I would deduct points for
"excessive cuteness" if someone came up with them
(unless it were a quiz on how to abuse the preprocessor :P)
--
"It is impossible to make anything foolproof
because fools are so ingenious"
- A. Bloch
Jul 22 '05 #19

"Joe Wright" <jo********@comcast.net> wrote in message
news:q5********************@comcast.com...
#define AND &&
#define OR ||
#define NOT !

So I guess there is a fourth category, which the OP's definitions might
fall into: improving readability.

The point was not primarily readability, but rather to enable people to
use those operators if their keyboard either did not support some/all of
those characters, or if it was inconveniently difficult to type them.

I suppose not. This business of abusing the preprocessor is
typically done by newbies at play. Have you ever seen any code by a
professional programmer (someone who gets paid for it) using this
kind of stuff? I have not.


Me neither, as I have not worked with people who use a keyboard that
does not allow easy access to those characters, and I am sure you have not
either. I did not say that it is common practice nor that it should be
encouraged in any way. All I am saying is that those things primarily exist
to enable people who do not have easy access to those characters, to use
them anyway.

regards
--
jb

(replace y with x if you want to reply by e-mail)
Jul 22 '05 #20

sa*********@yahoo.com (Sandeep Sharma) wrote in message news:<4c*************************@posting.google.c om>...
Right from the time the first edition of K&R was released, the
advantages of using symbolic constants, as opposed to "magic numbers",
have been emphasized, and for good reason. I don't dispute that at
all. However, it gets on my nerves when people carry this practice
too far. Consider these examples (from the code written by a
distinguished colleague):

#define DASH '-'
#define SLASH '/'
#define SINGLE_BYTE 1

It is one thing to use symbolic constants with meaningful (and in some
cases, abstract) names, but methinks that use of symbolic constants in
this way is a complete waste. Let us take the first example. Either
the definition of DASH will never change (in which case its usage is
superfluous) or the definition of DASH will change in the future (in
which case it will be completely misleading).
Any opinions?

--SS


Apart from the fact that using the preprocessor in this context is
horrible, the above makes sense if the code could eventually be ported
to Unicode.
In this case the code would look something like:

stdchar const dash(CHAR_T('-'));

where CHAR_T would be a preprocessor define:

#if defined(WIDE_CHAR_USE)
# define CHAR_T(_x) L ## _x
#else
# define CHAR_T(_x) _x
#endif

Regards
Peter
Jul 22 '05 #21

Stephen Sprunk wrote:

"Kenneth Brody" <ke******@spamcop.net> wrote in message [...]
It's amazing how many code samples I see that have code to build path
names differently under DOS/Windows, when these are used strictly for
internal purposes and not for system commands. And it's amazing how
many people are shocked when asked "since when?" and I tell them
"since always".


This was true even for the DOS CLI prior to 5.0 when MS appropriated the
forward slash for command options instead of following unix's dash
convention.


Well "since always" includes "prior to 5.0". ;-)

The problem goes all the way back to MS-DOS 1.0 using slashes for command-
line flags. Then, when 2.0 allowed subdirectories, they were stuck.
While a forward slash may work in most API calls and many apps
today, that doesn't mean MS will continue to support it tomorrow, so using a
backslash is "safer" even if it's not quite as readable ('\\' vs '/') in C
source.


I don't think that even MS has the balls to break that compatibility.

--
+-------------------------+--------------------+-----------------------------+
| Kenneth J. Brody | www.hvcomputer.com | |
| kenbrody at spamcop.net | www.fptech.com | #include <std_disclaimer.h> |
+-------------------------+--------------------+-----------------------------+

Jul 22 '05 #22

Kenneth Brody <ke******@spamcop.net> scribbled the following
on comp.lang.c:
Stephen Sprunk wrote:
While a forward slash may work in most API calls and many apps
today, that doesn't mean MS will continue to support it tomorrow, so using a
backslash is "safer" even if it's not quite as readable ('\\' vs '/') in C
source.
I don't think that even MS has the balls to break that compatibility.


They're MS. They'd rather see UNIX adopt \ as a directory separator.
Erm, wait, scratch that. They'd rather not see UNIX at all.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"You have moved your mouse, for these changes to take effect you must shut down
and restart your computer. Do you want to restart your computer now?"
- Karri Kalpio
Jul 22 '05 #23

In <c6*************@news.t-online.com> "Jakob Bieling" <ne*****@gmy.net> writes:
"Joe Wright" <jo********@comcast.net> wrote in message
news:q5********************@comcast.com...
>>#define AND &&
>>#define OR ||
>>#define NOT !
>>
>>So I guess there is a fourth category, which the OP's definitions might
>>fall into: improving readability.

> The point was not primarily readability, but rather to enable people to
> use those operators if their keyboard either did not support some/all of
> those characters, or if it was inconveniently difficult to type them.

I suppose not. This business of abusing the preprocessor is
typically done by newbies at play. Have you ever seen any code by a
professional programmer (someone who gets paid for it) using this
kind of stuff? I have not.


Me neither, as I have not worked with people who use a keyboard that
does not allow easy access to those characters, and I am sure you have not
either. I did not say that it is common practice nor that it should be
encouraged in any way. All I am saying is, that those things primarily exist
to enable people who do not have easy access to those characters, to use
them anyway.


The good question is whether such people do exist. 20 years ago, they
did, but back then there was no <iso646.h>...

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Jul 22 '05 #24

P: n/a
In <2q********************@bgtnsc05-news.ops.worldnet.att.net> Nick Landsberg <hu*****@NOSPAM.att.net> writes:
As a matter of fact, the first time I saw code for what was
to become the Bourne Shell (circa 1982), it was written exactly like
that except all the #defines were in CAPS.


The Bourne Shell is older than that (it was the shell of Unix V7, released
at about the same time as K&R1) and it got these macros (and others, a lot
uglier, to hide the C's execution control syntax) by including "algol.h".

The result was something no one ever wanted to maintain...

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Jul 22 '05 #25

P: n/a
In <73******************************@news.teranews.com> "Stephen Sprunk" <st*****@sprunk.org> writes:
"Kenneth Brody" <ke******@spamcop.net> wrote in message
news:40***************@spamcop.net...
All versions of MS-Windows, and all versions of MS-DOS back to 2.0
when they first allowed subdirectories can use the following:

FILE *f = fopen("/foo/bar/foobar.txt","r");

And even system() works if you're not executing a brain-dead command.
For example:

system("vi /foo/bar/foobar.txt");

It's amazing how many code samples I see that have code to build path
names differently under DOS/Windows, when these are used strictly for
internal purposes and not for system commands. And it's amazing how
many people are shocked when asked "since when?" and I tell them
"since always".


This was true even for the DOS CLI prior to 5.0 when MS appropriated the
forward slash for command options instead of following unix's dash
convention. While a forward slash may work in most API calls and many apps
today, that doesn't mean MS will continue to support it tomorrow, so using a
backslash is "safer" even if it's not quite as readable ('\\' vs '/') in C
source.


As far as I know, ever since MSDOS 2.0, MS OSs used / internally and \
only in the user interface. It's a bit late for them to change...

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Jul 22 '05 #26

P: n/a
"Dan Pop" <Da*****@cern.ch> wrote in message
news:c6**********@sunnews.cern.ch...
encouraged in any way. All I am saying is, that those things primarily exist
to enable people who do not have easy access to those characters, to use
them anyway.


The good question is whether such people do exist. 20 years ago, they
did, but back then there was no <iso646.h>...


Good point .. and as I have no knowledge about any other type of
keyboard other than the German and the US layout, I cannot tell if there is
not some kind of keyboard that does not have those keys. Maybe for a
different system, other than IBM compatible, that has a completely different
keyboard, but still has a C++ compiler .. just speculating, tho. But if it
does exist somewhere out there, those guys and girls using it sure will be
glad to have iso646.h ;)

regards
--
jb

(replace y with x if you want to reply by e-mail)
Jul 22 '05 #27

P: n/a
In article <40***************@spamcop.net>,
Kenneth Brody <ke******@spamcop.net> wrote:
Stephen Sprunk wrote:
The problem goes all the way back to MS-DOS 1.0 using slashes for command-
line flags. Then, when 2.0 allowed subdirectories, they were stuck.


Come on, be fair.. The problem started before MS-DOS, since CP/M used
'/' for options before MS-DOS even existed. And CP/M got it from various
DEC OS's like RSTS and RT11 before that. I don't know if DEC introduced
the idea, or if they got it from somewhere previous, but by now we're
talking about earlier than Unix V6, so DEC can hardly be faulted for not
following "the one true way".

Marcus Hall
ma****@tuells.org
Jul 22 '05 #28

P: n/a
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

marcus hall wrote:
| In article <40***************@spamcop.net>,
| Kenneth Brody <ke******@spamcop.net> wrote:
|
|>Stephen Sprunk wrote:
|>The problem goes all the way back to MS-DOS 1.0 using slashes for command-
|>line flags. Then, when 2.0 allowed subdirectories, they were stuck.
|
|
| Come on, be fair.. The problem started before MS-DOS, since CP/M used
| '/' for options before MS-DOS even existed. And CP/M got it from various
| DEC OS's like RSTS and RT11 before that. I don't know if DEC introduced
| the idea, or if they got it from somewhere previous, but by now we're
| talking about earlier than Unix V6, so DEC can hardly be faulted for not
| following "the one true way".
|
| Marcus Hall
| ma****@tuells.org

Just thumbing through my CP/M manual, I have to say I don't notice any
forward slashes mentioned anywhere (apart from in the name of the OS
:-). Seems all the standard CP/M utilities like their options in square
brackets...
That's not to say that the practice wasn't used in 3rd party CP/M
programs, just not in the system itself.

Ross
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.3 (GNU/Linux)
Comment: Using GnuPG with Thunderbird - http://enigmail.mozdev.org

iD8DBQFAkuB79bR4xmappRARAiDoAJ9v5I6Bt8mBa7/MuioNpFvXQ0rLAwCePwNb
PXPOrDynSuH19OjELKezPuo=
=gGte
-----END PGP SIGNATURE-----
Jul 22 '05 #29

P: n/a
"Ross Kendall Axe" <ro******@blueyonder.co.uk> wrote
Hash: SHA1

marcus hall wrote:
| In article <40***************@spamcop.net>,
| Kenneth Brody <ke******@spamcop.net> wrote:
|
|>Stephen Sprunk wrote:
|>The problem goes all the way back to MS-DOS 1.0 using slashes for command-
|>line flags. Then, when 2.0 allowed subdirectories, they were stuck.
|
|
| Come on, be fair.. The problem started before MS-DOS, since CP/M used
| '/' for options before MS-DOS even existed. And CP/M got it from various
| DEC OS's like RSTS and RT11 before that. I don't know if DEC introduced
| the idea, or if they got it from somewhere previous, but by now we're
| talking about earlier than Unix V6, so DEC can hardly be faulted for not
| following "the one true way".
|
| Marcus Hall
| ma****@tuells.org

Just thumbing through my CP/M manual, I have to say I don't notice any
forward slashes mentioned anywhere (apart from in the name of the OS
:-). Seems all the standard CP/M utilities like their options in square
brackets...
That's not to say that the practice wasn't used in 3rd party CP/M
programs, just not in the system itself.


Um... the square brackets indicated optional arguments. You didn't actually type
the brackets. :-)

Claudio Puviani
Jul 22 '05 #30

P: n/a
Claudio Puviani <pu*****@hotmail.com> scribbled the following
on comp.lang.c:
"Ross Kendall Axe" <ro******@blueyonder.co.uk> wrote
Just thumbing through my CP/M manual, I have to say I don't notice any
forward slashes mentioned anywhere (apart from in the name of the OS
:-). Seems all the standard CP/M utilities like their options in square
brackets...
That's not to say that the practice wasn't used in 3rd party CP/M
programs, just not in the system itself.
Um... the square brackets indicated optional arguments. You didn't actually type
the brackets. :-)


I'd bet a lot of users still tried to, though, and were surprised at why
it didn't work.
I've seen my fair share of Commodore 64 BASIC programs that begin by
printing "<CLR>" (literally, as in a less than sign, C, L, R, and a
greater than sign) to the screen, followed by the introductory text.
The writer of the program copied it from a magazine listing where
"<CLR>" was used as a transcript of the CLR control character, which
could be typed by hand on a Commodore 64 but which wasn't printed very
well on paper. The magazine was bound to have an introductory paragraph
about transcripts of control characters, that they were not to be typed
literally, but many readers ignored it when typing in the programs.
I was once in the Finnish science center Heureka and went over to a
computer running a cuneiform scripting program. The program displayed
a menu:
"Please select your language:
Finnish: type 1 + enter
Swedish: type 2 + enter
English: type 3 + enter"
At least one person *insisted* that the choice must be made by first
typing the digit 1, 2 or 3, then by typing the plus sign "+", and
finally by pressing enter.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"The day Microsoft makes something that doesn't suck is probably the day they
start making vacuum cleaners."
- Ernst Jan Plugge
Jul 22 '05 #31

P: n/a
In article <xY**********************@news4.srv.hcvlny.cv.net>,
Claudio Puviani <pu*****@hotmail.com> wrote:
Um... the square brackets indicated optional arguments. You didn't actually type
the brackets. :-)


Have you actually used CP/M? The CP/M manuals use curly braces to indicate
optional arguments. The arguments that you type in on the command line are
enclosed in square brackets or start with a $ sign (older utilities).

Examples from the "CP/M Plus Command Summary":

A>DEVICE LPT [XON,9600]
A>DEVICE CONSOLE [COLUMNS=40,LINES=16]
A>DIR [EXCLUDE] *.DAT
A>DIR [DRIVE=ALL USER=ALL] TESTFILE.BOB
A>MAC SAMPLE $PB AA HB SX -M
A>SET MYFILE.TEX [PASSWORD=MYFIL]
A>SET [DEFAULT=password]
A>SET [CREATE=ON,UPDATE=ON]

MicroSoft's languages for CP/M (F80, M80, L80...) did use the slash to
introduce program options, where they got this from I don't know (but
it certainly wasn't from CP/M).

--
Göran Larsson http://www.mitt-eget.com/
Jul 22 '05 #32

P: n/a
"Goran Larsson" <ho*@invalid.invalid> wrote
Claudio Puviani <pu*****@hotmail.com> wrote:
Um... the square brackets indicated optional arguments. You didn't actually type the brackets. :-)
Have you actually used CP/M?


Well, I used it regularly from 1980 to 1984 and I still have 3 functional CP/M
computers (not "other" computers with CP/M cartridges). Does that count?
The CP/M manuals use curly braces to indicate
optional arguments. The arguments that you type in on the command line are
enclosed in square brackets or start with a $ sign (older utilities).

Examples from the "CP/M Plus Command Summary":

A>DEVICE LPT [XON,9600]
A>DEVICE CONSOLE [COLUMNS=40,LINES=16]
A>DIR [EXCLUDE] *.DAT
A>DIR [DRIVE=ALL USER=ALL] TESTFILE.BOB
A>MAC SAMPLE $PB AA HB SX -M
A>SET MYFILE.TEX [PASSWORD=MYFIL]
A>SET [DEFAULT=password]
A>SET [CREATE=ON,UPDATE=ON]
CP/M Plus (or 3.0) was a too-little-too-late response from Digital Research after
CP/M had already lost the war. CP/M 2.2 is the version that had captured the
maket when there was a market to capture and it's still the baseline that all
CP/M software development targeted. Its commands were different from CP/M Plus,
as you can see at: http://www.gaby.de/cpm/manuals/archi...2htm/index.htm

You'll also note that the original documentation did use square brackets to
indicate optional arguments.
MicroSofts langauages for CP/M (F80, M80, L80...) did use the slash to
introduce program options, where they got this from I don't know (but
it certainly wasn't from CP/M).


Since CP/M itself didn't enforce any delimiters, programs used whatever they
wanted. Some used slashes, some use dashes, some used nothing at all.

Claudio Puviani
Jul 22 '05 #33

P: n/a
In article <YE**********************@news4.srv.hcvlny.cv.net>,
Claudio Puviani <pu*****@hotmail.com> wrote:
Well, I used it regularly from 1980 to 1984 and I still have 3 functional CP/M
computers (not "other" computers with CP/M cartridges). Does that count?
Sure.
Since CP/M itself didn't enforce any delimiters, programs used whatever they
wanted. Some used slashes, some use dashes, some used nothing at all.


Yes, but the programs delivered with CP/M never used / for options.
Programs delivered with CP/M 2 prefer to use $ for options.
Programs delivered with CP/M 3 prefer to use [ and ] for options.
Programs sold by MicroSoft opted to use / for options.

Blaming CP/M for MicroSoft's use of / for options is wrong. The blame
must be somewhere else, probably DEC as MicroSoft used DECsystem 10
to cross compile their CP/M programs.

--
Göran Larsson http://www.mitt-eget.com/
Jul 22 '05 #34

P: n/a
"Goran Larsson" <ho*@invalid.invalid> wrote
Claudio Puviani <pu*****@hotmail.com> wrote:
Well, I used it regularly from 1980 to 1984 and I still have 3 functional CP/M computers (not "other" computers with CP/M cartridges). Does that count?


Sure.
Since CP/M itself didn't enforce any delimiters, programs used whatever they
wanted. Some used slashes, some use dashes, some used nothing at all.


Yes, but the programs delivered with CP/M never used / for options.
Programs delivered with CP/M 2 prefer to use $ for options.
Programs delivered with CP/M 3 prefer to use [ and ] for options.
Programs sold by MicroSoft opted to use / for options.

Blaming CP/M for MicroSoft's use of / for options is wrong. The blame
must be somewhere else, probably DEC as MicroSoft used DECsystem 10
to cross compile their CP/M programs.


You need to follow the thread more attentively. I never accused CP/M of anything.

Claudio Puviani

Jul 22 '05 #35

P: n/a


[Followups set to alt.folklore.computers, where this is topical. AFC
readers: this thread sprang out of a discussion of path separator
characters, and has now become a shouting match over where Microsoft
got the idea to use the forward slash as an option separator, and
then fell back on the backslash as a path separator.]
In article <Hx********@approve.se>, ho*@invalid.invalid (Goran Larsson) writes:
In article <YE**********************@news4.srv.hcvlny.cv.net>,
Claudio Puviani <pu*****@hotmail.com> wrote:
Since CP/M itself didn't enforce any delimiters, programs used whatever they
wanted. Some used slashes, some use dashes, some used nothing at all.


Yes, but the programs delivered with CP/M never used / for options.
Programs delivered with CP/M 2 prefer to use $ for options.
Programs delivered with CP/M 3 prefer to use [ and ] for options.
Programs sold by MicroSoft opted to use / for options.

Blaming CP/M for MicroSoft's use of / for options is wrong. The blame
must be somewhere else, probably DEC as MicroSoft used DECsystem 10
to cross compile their CP/M programs.


"Blame" seems a bit harsh, since slash as an option separator was
already established by VMS when MS-DOS 1.0 came out.[*] Blame
Microsoft instead for a poor choice of path syntax in MS-DOS 2.0.
Since they already used the VMS syntax for options, they might as
well have adopted its syntax for paths, rather than trying for some
weird mix of VMSisms and half-assed Unixisms and ending up with two
crucial punctuation characters that many users (apparently) can't
reliably distinguish.

In any case, MS-DOS 1.0 was basically a rebranding of Seattle
Computer Products' QDOS, wasn't it? Did QDOS use the slash as an
option delimiter? It might not have even been Microsoft's choice.

[*] I'm assuming here that VMS's use of slash as option separator
dates back earlier than MS-DOS 1.0. VMS 1.0 was released in 1978,
but I didn't use it myself before 1986, so I couldn't say.

--
Michael Wojcik mi************@microfocus.com

It's like being shot at in an airport with all those guys running
around throwing hand grenades. Certain people function better with
hand grenades coming from all sides than other people do when the
hand grenades are only coming from inside out.
-- Dick Selcer, coach of the Cinci Bengals
Jul 22 '05 #36
