In article <f5**********@tdi.cu.mi.it>, Army1987 <pl********@for.it> wrote:
>> for (i = 0; i != sizeof a; ++i) {
> Any reason to use != where the rest of the world uses <?
There is a theory that it's better to use "<", because if the variable
somehow ends up bigger than the terminating value, the loop will
still stop. I believe this is sometimes considered to be "defensive
programming".
There is another theory that this is a really bad idea, because it will
hide bugs in your program (how did the variable get the bogus value?),
and you should instead use "!=" so that the error gets noticed sooner.
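With "!=" the same contrived bug doesn't stop quietly: the loop marches
past the end of the array, and an assert (or the eventual crash) points
straight at the problem. A sketch, with the helper and the assert added
by me:

    #include <assert.h>
    #include <stddef.h>

    void clear(char *a, size_t n)
    {
        size_t i;

        for (i = 0; i != n; ++i) {
            assert(i < n);      /* fires as soon as i skips past n */
            a[i] = 0;
            if (i == 3)         /* same simulated bug */
                i += 20;
        }
    }

    int main(void)
    {
        char a[10];
        clear(a, sizeof a);     /* aborts with an assertion failure */
        return 0;
    }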
From the point of view of readability, I think that "<" is more likely to
express the way the programmer is thinking about it - the terminating
value marks the end of a range, rather than being a sentinel value as
in "while(*p++ != '\0')".
So using "!=" seems to me to be in the same class of idioms as writing
"if(1 == a)" rather than "if(a == 1)" - it might sometimes result in
earlier error detection, but at the expense of naturalness and hence
readability.
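The parallel: reversing the operands turns the classic "=" typo into a
compile-time error, at the cost of reading backwards. Illustration mine:

    #include <stdio.h>

    int main(void)
    {
        int a = 0;

        if (a == 1)     /* natural order; the typo "a = 1" compiles
                           and silently assigns */
            puts("one");

        if (1 == a)     /* reversed order; the typo "1 = a" is a
                           constraint violation the compiler must
                           diagnose */
            puts("one");

        return 0;
    }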
-- Richard
--
"Considerat ion shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.