Bytes IT Community

size_t problems

I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than 4GB
long.

The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.
I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?

jacob
Aug 29 '07 #1
409 Replies



"jacob navia" <ja***@jacob.remcomp.fr> wrote in message
news:46**********************@news.orange.fr...
>I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than 4GB
long.

The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.
I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
There's a very obvious answer to that one. As a compiler-writer, you are
in a position to do it.

--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm

Aug 29 '07 #2

jacob navia wrote:
[... "64-bit compiler" with 64-bit size_t ...]
The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.

I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
Well, some people will probably claim that those hundreds of warnings
are a good thing, as strlen() returns size_t and not int. However,
if you are bombarded with hundreds of such warnings, many people will
simply start ignoring all of the warnings, and the "real" ones will
be lost in the noise.

Perhaps a flag that says "only display the first N instances of this
warning"?

Perhaps you could make int 64 bits as well?

--
+-------------------------+--------------------+-----------------------+
| Kenneth J. Brody | www.hvcomputer.com | #include |
| kenbrody/at\spamcop.net | www.fptech.com | <std_disclaimer.h> |
+-------------------------+--------------------+-----------------------+
Don't e-mail me at: <mailto:Th*************@gmail.com>

Aug 29 '07 #3

jacob navia <ja***@jacob.remcomp.fr> writes:
I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than 4GB
long.

The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.
I'd suggest fixing the code that does this to use size_t instead
of int. size_t is correct. int is, at best, an approximation to
correct. We've just had a pretty long thread with Malcolm McLean
discussing this very topic; perhaps you should refer to that
thread, if you're not already aware of it.
--
char a[]="\n .CJacehknorstu";int putchar(int);int main(void){unsigned long b[]
={0x67dffdff,0x9aa9aa6a,0xa77ffda9,0x7da6aa6a,0xa67f6aaa,0xaa9aa9f6,0x11f6},*p
=b,i=24;for(;p+=!*p;*p/=4)switch(0[p]&3)case 0:{return 0;for(p--;i--;i--)case+
2:{i++;if(i)break;else default:continue;if(0)case 1:putchar(a[i&15]);break;}}}
Aug 29 '07 #4

jacob navia <ja***@jacob.remcomp.fr> writes:
I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than 4GB
long.

The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.
I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
Why didn't you get the same warnings in 32-bit mode? If int and
size_t are both 32 bits, INT_MAX < SIZE_MAX, and there are values of
size_t that cannot be stored in an int. If the "narrowing conversion"
warning is based on the sizes of the types rather than their ranges, I'd
say you've just discovered a compiler bug.

If you're getting hundreds of warnings, it's because you have hundreds
of instances of potential loss of information.

Note that a conversion to a signed type of a value that doesn't fit in
that type yields an implementation-defined result (or, in C99, raises
an implementation-defined signal). In theory, the result could be
more than just a loss of information.

The problem is to distinguish cases where the conversion can't
actually overflow at execution time from the cases where it can.

Sufficiently clever dataflow analysis in the compiler might eliminate
some of the warnings. If, given
int s = strlen(str);
the compiler knows enough about the value of str to be
sure it's no longer than INT_MAX bytes, it can eliminate the warning.
But I don't know if it's practical, or even possible to eliminate
enough of the warnings this way. Doing this in most cases is hard;
doing it in all cases might be equivalent to solving the halting
problem. (That latter is only a guess.)

(Making int 64 bits won't solve the problem, since INT_MAX will still
be less than SIZE_MAX.)

You can filter the compiler's output to eliminate warnings about
narrowing implicit conversions (or, if available, use a compiler
option to turn off that particular warning), but that could miss cases
that could actually overflow.

In my opinion, the warnings are legitimate. The ideal solution is not
to suppress them, but to fix the code, assigning the result of
strlen() to a size_t rather than to an int. (Or I suppose you could
use a cast to shut up the compiler if you're *certain* the result can
never exceed INT_MAX, but that's not what I'd do.)

By compiling the code in 64-bit mode, you've discovered a number of
dormant bugs in the code.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Aug 29 '07 #5

"Malcolm McLean" <re*******@btinternet.com> writes:
"jacob navia" <ja***@jacob.remcomp.fr> wrote in message
news:46**********************@news.orange.fr...
[...]
>>I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.
[...]
>int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.
I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
There's a very obvious answer to that one. As a compiler-writer, you
are in a position to do it.
I presume the solution you're suggesting is to make int 64 bits. How
does this help? strlen() still returns size_t, and if int and size_t
are both 64 bits, there will still be size_t values that cannot be
stored in an int.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Aug 29 '07 #6


"Ben Pfaff" <bl*@cs.stanford.edu> wrote in message
news:87************@blp.benpfaff.org...
I'd suggest fixing the code that does this to use size_t instead
of int. size_t is correct. int is, at best, an approximation to
correct. We've just had a pretty long thread with Malcolm McLean
discussing this very topic; perhaps you should refer to that
thread, if you're not already aware of it.
Yup. As I said, if people would use size_t consistently for every single
calculation that ultimately ends up in an array index there wouldn't be such
a problem. The reality is that people won't, and lots of code doesn't.

--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm

Aug 29 '07 #7


"Keith Thompson" <ks***@mib.org> wrote in message
news:ln************@nuthaus.mib.org...
"Malcolm McLean" <re*******@btinternet.com> writes:
>"jacob navia" <ja***@jacob.remcomp.fr> wrote in message
news:46**********************@news.orange.fr...
[...]
>>>I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.
[...]
>>int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.
I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
There's a very obvious answer to that one. As a compiler-writer, you
are in a position to do it.

I presume the solution you're suggesting is to make int 64 bits. How
does this help? strlen() still returns size_t, and if int and size_t
are both 64 bits, there will still be size_t values that cannot be
stored in an int.
Yes, but then you'd need an extremely long string to break the code, so the
warning can be suppressed with some confidence that it won't cause a
malfunction.

--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm


Aug 29 '07 #8

Keith Thompson wrote:
Why didn't you get the same warnings in 32-bit mode? If int and
size_t are both 32 bits, INT_MAX < SIZE_MAX, and there are values of
size_t that cannot be stored in an int. If the "narrowing conversion"
warning is based on the sizes of the types rather than their ranges, I'd
say you've just discovered a compiler bug.
2GB strings are the most you can get under the Windows scheme in 32 bits.
If you're getting hundreds of warnings, it's because you have hundreds
of instances of potential loss of information.
Yes, "*POTENTIALLY*" I could be missing all those strings longer
than 4GB (!!!). But I do not care about those :-)
Note that a conversion to a signed type of a value that doesn't fit in
that type yields an implementation-defined result (or, in C99, raises
an implementation-defined signal). In theory, the result could be
more than just a loss of information.
Only for strings >2GB Keith. Let's keep it realistic!
The problem is to distinguish cases where the conversion can't
actually overflow at execution time from the cases where it can.

Sufficiently clever dataflow analysis in the compiler might eliminate
some of the warnings. If, given
int s = strlen(str);
the compiler knows enough about the value of str to be
sure it's no longer than INT_MAX bytes, it can eliminate the warning.
But I don't know if it's practical, or even possible to eliminate
enough of the warnings this way. Doing this in most cases is hard;
doing it in all cases might be equivalent to solving the halting
problem. (That latter is only a guess.)

(Making int 64 bits won't solve the problem, since INT_MAX will still
be less than SIZE_MAX.)

You can filter the compiler's output to eliminate warnings about
narrowing implicit conversions (or, if available, use a compiler
option to turn off that particular warning), but that could miss cases
that could actually overflow.

In my opinion, the warnings are legitimate. The ideal solution is not
to suppress them, but to fix the code, assigning the result of
strlen() to a size_t rather than to an int. (Or I suppose you could
use a cast to shut up the compiler if you're *certain* the result can
never exceed INT_MAX, but that's not what I'd do.)

By compiling the code in 64-bit mode, you've discovered a number of
dormant bugs in the code.
There isn't any string longer than a few K in this program!
Of course it is a potential bug, but it is practically impossible!
Aug 29 '07 #9

Malcolm McLean wrote:
>
"jacob navia" <ja***@jacob.remcomp.fr> wrote in message
news:46**********************@news.orange.fr...
>I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than 4GB
long.

The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.
I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
There's a very obvious answer to that one. As a compiler-writer, you are
in a position to do it.
???

(Please excuse my stupidity, but I do not see it...)

Aug 29 '07 #10

On Aug 29, 8:08 pm, jacob navia <ja...@jacob.remcomp.fr> wrote:
I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than 4GB
long.

The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.

I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
So the compiler is giving a warning when a 64 bit value is assigned to
a 32 bit variable, but not when a 32 bit unsigned value is assigned to
a 32 bit signed variable.

Well, just because you changed size_t to 64 bits doesn't make strings
any longer. strlen ("hello") still returns 5 and it will fit into an
int just as well as before. So you _could_, possibly as a compiler
option, mark certain functions as returning small(ish) values that
don't require a warning when stored in an int.

But maybe you should look at it from the point of view of a developer
who is switching from a 32 bit to a 64 bit compiler (or most likely
wants to write code that runs fine on a 32 bit and a 64 bit system),
and who _wants_ to fix problems. That programmer would _want_ the
warning and change the variable from int to something else.

Here is the approach that Apple takes: Define two typedefs, Int and
Uint (they actually use different names, but that doesn't matter).
These are used for almost all integer values. On a 32 bit system (32
bit int/long/size_t) they are equal to int/unsigned int, on a 64 bit
system (32 bit int, 64 bit long/size_t) they are equal to long/
unsigned long. Your warning problem goes away. Different types are
used on purpose so that if you mismatch int*/Int* or long*/Int* either
the 32 bit or 64 bit version will give you a compiler error.

Situations where you don't use these types: If you definitely need 64
bit, use long long. If you want to save space, use char/short/int as
suitable.

Aug 29 '07 #11


"jacob navia" <ja***@jacob.remcomp.fr> wrote in message
news:46***********************@news.orange.fr...
Malcolm McLean wrote:
>There's a very obvious answer to that one. As a compiler-writer, you are
in a position to do it.

???

(Please excuse my stupidity, but I do not see it...)
The campaign for 64 bit ints T-shirts obviously didn't generate enough
publicity. I still have a few left. XXL, one size fits all.

There are some good reasons for not making int 64 bits on a 64 bit machine,
which as a compiler-writer you will be well aware of. However typical
computers are going to have 64 bits of main address space for a very long
time to come, so it makes sense to get the language right now, and keep it
that way for the foreseeable future, and not allow decisions to be dominated
by the need to maintain compatibility with legacy 32 bit libraries.
--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm

Aug 29 '07 #12

On Aug 29, 12:08 pm, jacob navia <ja...@jacob.remcomp.fr> wrote:
I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than 4GB
long.

The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Now the problem:

Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.

I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
Make your default int 64 bits, and be done with it.
Ought to be 64 bits on a 64 bit platform anyway.

Aug 29 '07 #13

On Wed, 29 Aug 2007 20:52:20 +0100, Malcolm McLean wrote:
"Ben Pfaff" <bl*@cs.stanford.edu> wrote in message
news:87************@blp.benpfaff.org...
>I'd suggest fixing the code that does this to use size_t instead
of int. size_t is correct. int is, at best, an approximation to
correct. We've just had a pretty long thread with Malcolm McLean
discussing this very topic; perhaps you should refer to that
thread, if you're not already aware of it.
Yup. As I said, if people would use size_t consistently for every single
calculation that ultimately ends up in an array index there wouldn't be such
a problem. The reality is that people won't, and lots of code doesn't.
And lots of people do and lots of code does, and those people don't get
those problems on that code.

Which just goes to show, doing the right thing - using size_t - makes
perfect sense, and ignoring the right thing - as you persist in doing -
makes for problems.
Aug 29 '07 #14

On Aug 29, 12:51 pm, Keith Thompson <ks...@mib.org> wrote:
"Malcolm McLean" <regniz...@btinternet.com> writes:
"jacob navia" <ja...@jacob.remcomp.fr> wrote in message
news:46**********************@news.orange.fr...
[...]
>I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.
The problem appears now that size_t is now 64 bits.

[...]


int s;
// ...
s = strlen(str) ;
Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.
This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...
Now the problem:
Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings get lost.
I do not know how to get out of this problem. Maybe one of you has
a good idea? How do you solve this when porting to 64 bits?
There's a very obvious answer to that one. As a compiler-writer, you are
in a position to do it.

I presume the solution you're suggesting is to make int 64 bits. How
does this help? strlen() still returns size_t, and if int and size_t
are both 64 bits, there will still be size_t values that cannot be
stored in an int.
If strlen() returns a number bigger than 9,223,372,036,854,775,807
then there are bigger fish to fry.
Sure, Bill Gates supposedly said that nobody will ever need more than
640K of RAM, and so someday it may be true that strings longer than 9
quintillion bytes are common. But I guess it will be a minor problem
until he can get around to fully correcting the code the right way by
assigning size_t values to the return from strlen() and other things
that return a size_t.

Aug 29 '07 #15

"Malcolm McLean" <re*******@btinternet.com> writes:
"Keith Thompson" <ks***@mib.org> wrote in message
news:ln************@nuthaus.mib.org...
>"Malcolm McLean" <re*******@btinternet.com> writes:
[...]
>>There's a very obvious answer to that one. As a compiler-writer, you are
in a position to do it.

I presume the solution you're suggesting is to make int 64 bits. How
does this help? strlen() still returns size_t, and if int and size_t
are both 64 bits, there will still be size_t values that cannot be
stored in an int.
Yes, but then you'd need an extremely long string to break the code,
so the warning can be suppressed with some confidence that it won't
cause a malfunction.
That's assuming you're able to suppress the warning for 64-bit
unsigned to 64-bit signed conversions without suppressing warnings for,
say, 8-bit unsigned to 8-bit signed conversions. I don't know of any
compiler that allows that kind of fine-grained control.

It's better to fix the code. It's even better to write it correctly
in the first place.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Aug 29 '07 #16

In article <46**********************@news.orange.fr>,
jacob navia <ja***@jacob.remcomp.fr> wrote:
> s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...
Clearly with strlen() the chance of it being an error is negligible.
And I think this is true of other size_t->int assignments. For example,
int s = sizeof(whatever) is almost never a problem.

Ideally, I would suggest not generating a warning unless some option
is set for it. (There should always be a "maximally paranoid" option
to help track down obscure errors.) But that only applies to
size_t->int assignments. Other 64->32 assignments may be more likely to be
in error. At the point you generate the warning, can you still tell
that it's a size_t rather than some other 64-bit int type?

-- Richard
--
"Consideration shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.
Aug 29 '07 #17

On Aug 29, 3:05 pm, rich...@cogsci.ed.ac.uk (Richard Tobin) wrote:
In article <46d5c46d$0$5108$ba4ac...@news.orange.fr>,
jacob navia <ja...@jacob.remcomp.fr> wrote:
s = strlen(str) ;
Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.
This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...

Clearly with strlen() the chance of it being an error is negligible.
And I think this is true of other size_t->int assignments. For example,
int s = sizeof(whatever) is almost never a problem.

Ideally, I would suggest not generating a warning unless some option
is set for it. (There should always be a "maximally paranoid" option
to help track down obscure errors.) But that only applies to
size_t->int assignments. Other 64->32 assignments may be more likely to be
in error. At the point you generate the warning, can you still tell
that it's a size_t rather than some other 64-bit int type?
I doubt that the chance a string is longer than 2GB is always
negligible.

Consider the characters 'C', 'T', 'A', 'G' in various combinations in
a long sequence of (say) 3 billion.
That's the human genome.

The Chrysanthemum genome is much bigger.

I know of people using database systems to do genetics research. The
probability of long character sequences on those systems is not
negligible.

If the machine is capable of handling large data, right away people
will start to do it.

Aug 29 '07 #18

In article <11**********************@19g2000hsx.googlegroups.com>,
user923005 <dc*****@connx.comwrote:
>Make your default int 64 bits, and be done with it.
Ought to be 64 bits on a 64 bit platform anyway.
A compiler for an existing operating system needs to fit in with the
system's libraries, so he may not have that choice.

-- Richard
--
"Consideration shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.
Aug 29 '07 #19

In article <ln************@nuthaus.mib.org>,
Keith Thompson <ks***@mib.org> wrote:
>It's better to fix the code. It's even better to write it correctly
in the first place.
But int s = sizeof(char *) is not broken, even though sizeof() returns
a size_t.

-- Richard
--
"Consideration shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.
Aug 29 '07 #20

In article <11**********************@22g2000hsm.googlegroups.com>,
user923005 <dc*****@connx.com> wrote:
>I doubt that the chance a string is longer than 2GB is always
negligible.
"Always negligible" is irrelevant. Of course it's not negligible in
programs chosen to demonstrate the problem.
>Consider the characters 'C', 'T', 'A', 'G' in various combinations in
a long sequence of (say) 3 billion.
That's the human genome.
The chance of a given program being one that stores the complete human
genome in a string is negligible. People with such programs can set the
option I suggested.

-- Richard
--
"Consideration shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.
Aug 29 '07 #21


"Richard Tobin" <ri*****@cogsci.ed.ac.uk> wrote in message
news:fb***********@pc-news.cogsci.ed.ac.uk...
The chance of a given program being one that stores the complete human
genome in a string is negligible. People with such programs can set the
option I suggested.
I work in that field.
Whilst generally you'd want a "rope" type-structure to handle such a long
sequence, there might well be reasons for storing the whole genome as a flat
string. Certainly if I had a 64-bit machine with enough memory installed, I
would expect to have the option, and I'd expect to be able to write the
program in regular C.

--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm
Aug 29 '07 #22

jacob navia <ja***@jacob.remcomp.fr> writes:
Keith Thompson wrote:
>Why didn't you get the same warnings in 32-bit mode? If int and
size_t are both 32 bits, INT_MAX < SIZE_MAX, and there are values of
size_t that cannot be stored in an int. If the "narrowing conversion"
warning is based on the sizes of the types rather than their ranges, I'd
say you've just discovered a compiler bug.

2GB strings are the most you can get under the Windows scheme in 32 bits.
Ok. Does your compiler know that?

Assigning an arbitrary size_t value to an object of type int, if both
types are 32 bits, could potentially overflow. Your compiler
apparently doesn't issue a warning in that case. Is it because it
knows that the value returned by strlen() can't exceed INT_MAX (if so,
well done, especially since it seems to be smart enough not to make
that assumption on a 64-bit system), or is it because it doesn't issue
a warning when both types are the same size?

For example:

size_t s = func(-1);
/* Assume func() takes a size_t argument and returns it.
Assume func() is defined in another translation unit,
so the compiler can't analyze its definition. In other
words, 's' is initialized to SIZE_MAX, but the compiler
can't make any assumptions about its value. */

signed char c = s;
/* Presumably this produces a warning. */

int i = s;
/* This is a potential overflow. Does this produce
a warning? Should it? */

If your compiler warns about the initialization of 'c' but not about
the initialization of 'i', then IMHO it's being inconsistent. This
doesn't address your original question, but it's related.

[...]
There isn't any string longer than a few K in this program!
Of course it is a potential bug, but it is practically impossible!
You know that, and I know that, but what matters is what the compiler
knows.

Is it conceivable that a bug in the program and/or some unexpected
input could cause it to create a string longer than 2GB?

You asked how to suppress the bogus warnings without losing any valid
warnings. To do that, your compiler, or some other tool, has to be
able to tell the difference. Telling me that none of the strings are
longer than 2GB doesn't address that concern, unless you can convey
that knowledge to the compiler.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Aug 29 '07 #23

Richard Tobin wrote:
In article <11**********************@22g2000hsm.googlegroups.com>,
user923005 <dc*****@connx.com> wrote:
>I doubt that the chance a string is longer than 2GB is always
negligible.

"Always negligible" is irrelevant. Of course it's not negligible in
programs chosen to demonstrate the problem.
>Consider the characters 'C', 'T', 'A', 'G' in various combinations in
a long sequence of (say) 3 billion.
That's the human genome.

The chance of a given program being one that stores the complete human
genome in a string is negligible. People with such programs can set the
option I suggested.

-- Richard
The program has strings of at most a few K. It is an IDE (Integrated
development environment, debugger, etc)

An int can hold string lengths of more than 2 billion... MORE than
enough for this environment. This program has been running under 32 bit
windows where all user space is at most 2GB.
Aug 29 '07 #24

P: n/a
"Malcolm McLean" <re*******@btinternet.comwrites:
"jacob navia" <ja***@jacob.remcomp.frwrote in message
news:46***********************@news.orange.fr...
>Malcolm McLean wrote:
>>There's a very obvious answer to that one. As a compiler-writer,
youa re in a position to do it.

???

(Please excuse my stupidity by I do not see it...)
The campaign for 64 bit ints T-shirts obviously didn't generate enough
publicity. I still have a few left. XXL, one size fits all.
One *shirt* fits all (unless somebody other than you actually wants
one).
There are some good reasons for not making int 64 bits on a 64 bit
machine, which as a compiler-writer you will be well aware of. However
typical computers are going to have 64 bits of main address space for
a very long time to come, so it makes sense to get the language right
now, and keep it that way for the forseeable future, and not allow
decisions to be dominated by the need to maintain compatibility with
legacy 32 bit libraries.
lcc-win32 (and presumably lcc-win64, if that's what it's called) is a
Windows compiler. jacob does not have the option of changing the
Windows API, and a compiler that's incompatible with the underlying
operating system isn't going to be very useful.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Aug 29 '07 #25

P: n/a
Malcolm McLean wrote:
>
"Richard Tobin" <ri*****@cogsci.ed.ac.ukwrote in message
news:fb***********@pc-news.cogsci.ed.ac.uk...
>The chance of a given program being one that stores the complete human
genome in a string is negligible. People with such programs can set the
option I suggested.
I work in that field.
Whilst generally you'd want a "rope" type-structure to handle such a
long sequence, there might well be reasons for storing the whole genome
as a flat string. Certainly if I had a 64-bit machine with enough memory
installed, I would expect to have the option, and I'd expect to be able
to write the program in regular C.
YES SIR!

With my new lcc-win32 YOU WILL BE ABLE TO DO IT!

But I am not speaking of that program. I am speaking about
other programs I am PORTING from 32 bit, whose strings are never
bigger than a few Kbytes at most!

Aug 29 '07 #26

P: n/a
Keith Thompson wrote:
lcc-win32 (and presumably lcc-win64, if that's what it's called) is a
Windows compiler. jacob does not have the option of changing the
Windows API, and a compiler that's incompatible with the underlying
operating system isn't going to be very useful.
Yes. Mr Gates decided that

sizeof(int) == sizeof(long) == 4.

Only long long is 64 bits. Please address all flames to him.

NOT TO ME!!!

:-)
Aug 29 '07 #27

P: n/a

jacob navia:
The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.
<snip>
I do not know how to get out of this problem. Maybe any of you has
a good idea? How do you solve this when porting to 64 bits?
Assuming that you've a shred of intelligence, I'm led to believe that
you suffer from "int syndrome".

"int syndrome" reminds me of old drivers, the kind of people who
always drive the canonical route somewhere. Even during rush-hour,
even at night when the streets are clear, they always take the same
route. I don't know if you'd call it stubbornness or stupidity. They
lack dynamic-ity.

These drivers remind me of the programmers who are "int" people. The
solution to your boggle is so blatantly obvious that I'm not even
gonna mention what the solution is.

The real problem is why you feel so indoctrinated into using int,
especially places where you shouldn't be using it.

If you want advice though, I'd say use the appropriate types where
appropriate, and to edit any code that uses types wrongly.

Martin

Aug 30 '07 #28

P: n/a
jacob navia wrote:
>
I am trying to compile as much code in 64 bit mode as
possible to test the 64 bit version of lcc-win.

The problem appears now that size_t is now 64 bits.

Fine. It has to be since there are objects that are more than
4GB long. The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.

This can be correct, and in 99.9999999999999999999999999%
of the cases the string will be smaller than 2GB...
Simply define s as a long long or (better) as a size_t.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Aug 30 '07 #29

P: n/a
jacob navia wrote:
>
.... snip ...
>
int s = strlen(str) is NOT broken.
Yes it is. How can you guarantee that strlen never returns a value
that exceeds the capacity of an int? However:

size_t s = strlen(str);

is NOT broken, assuming suitable #includes and definitions.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Aug 30 '07 #30

P: n/a
CBFalconer <cb********@yahoo.comwrites:
jacob navia wrote:
... snip ...
>>
int s = strlen(str) is NOT broken.

Yes it is. How can you guarantee that strlen never returns a value
that exceeds the capacity of an int?
By never passing it a pointer to a string longer than INT_MAX
characters. This tends to be easier than, for example, guaranteeing
that 'x + y' will never overflow.

The declaration may or may not be broken, depending on what happens at
run time. The problem is that, apparently, the programmer knows it's
safe, but the compiler doesn't have enough information to prove it.

The ideal solution is to declare s as a size_t, and to make whatever
other code changes follow from that, but that's not always practical.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Aug 30 '07 #31

P: n/a
Martin Wells wrote:
jacob navia:
>The problem is, when you have in thousands of places

int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.
<snip>
>I do not know how to get out of this problem. Maybe any of you has
a good idea? How do you solve this when porting to 64 bits?

Assuming that you've a shred of intelligence, I'm led to believe that
you suffer from "int syndrome".

"int syndrome" reminds me of old drivers, the kind of people who
always drive the canonical route somewhere. Even during rush-hour,
even at night when the streets are clear, they always take the same
route. I don't know if you'd call it stubbornness or stupidity. They
lack dynamic-ity.

These drivers remind me of the programmers who are "int" people. The
solution to your boggle is so blatantly obvious that I'm not even
gonna mention what the solution is.

The real problem is why you feel so indoctrinated into using int,
especially places where you shouldn't be using it.

If you want advice though, I'd say use the appropriate types where
appropriate, and to edit any code that uses types wrongly.

Martin
Assuming that you have a shred of intelligence, you will be able
to understand this:

That int is used in many other contexts later, for instance
comparing it with other integers.
int i,len = strlen(str);

for (i=0; i<len; i++) {
/// etc
}
The i<len comparison would provoke a warning if len is unsigned...

If I make i unsigned too, then its usage within the loop will provoke
even more problems!
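For what it's worth, the one place an unsigned counter genuinely does bite is the descending loop, where `i >= 0` is always true for a size_t. A sketch of the usual workaround, in plain standard C (the function is invented, not from jacob's code):

```c
#include <string.h>

/* Count occurrences of c by walking str backwards with an unsigned
   index.  "i-- > 0" tests before decrementing, so the body sees
   indices len-1 down to 0 and the loop terminates cleanly even
   though a size_t can never go negative. */
size_t count_backwards(const char *str, char c)
{
    size_t count = 0;
    for (size_t i = strlen(str); i-- > 0; ) {
        if (str[i] == c)
            count++;
    }
    return count;
}
```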
Aug 30 '07 #32

P: n/a
Ian Collins said:
jacob navia wrote:
<snip>
>int s = strlen(str) is NOT broken.

Why would you want to assign an unsigned value to an int? Why do you
think it makes sense to have a negative size?
Well, obviously it doesn't make any sense at all, and assigning strlen's
result to an int is clearly wrong; strlen yields size_t, not int.

On the other hand, does it really make sense to play with trolls?

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -www. +rjh@
Google users: <http://www.cpax.org.uk/prg/writings/googly.php>
"Usenet is a strange place" - dmr 29 July 1999
Aug 30 '07 #33

P: n/a
jacob navia wrote:
>
That int is used in many other contexts later, for instance
comparing it with other integers.
int i,len = strlen(str);

for (i=0; i<len; i++) {
/// etc
}
The i<len comparison would provoke a warning if len is unsigned...

If I make i unsigned too, then its usage within the loop will provoke
even more problems!
Name one.

--
Ian Collins.
Aug 30 '07 #34

P: n/a
Richard Heathfield wrote:
>
On the other hand, does it really make sense to play with trolls?
It beats work...

--
Ian Collins.
Aug 30 '07 #35

P: n/a
Ian Collins wrote:
Richard Heathfield wrote:
>On the other hand, does it really make sense to play with trolls?
It beats work...
OK. You win. Will not answer any posts from you.
Aug 30 '07 #36

P: n/a
jacob navia wrote:
Ian Collins wrote:
>Richard Heathfield wrote:
>>On the other hand, does it really make sense to play with trolls?
It beats work...

OK. You win. Will not answer any posts from you.
Bad humour day today?

You normally stop once you realise I'm correct...

--
Ian Collins.
Aug 30 '07 #37

P: n/a
jacob navia wrote:
int s;

// ...
s = strlen(str) ;

Since strlen returns a size_t, we have a 64 bit result being
assigned to a 32 bit int.
....
Since I warn each time a narrowing conversion is done (since
that could lose data) I end up with hundreds of warnings each time
a construct like int a = strlen(...) appears. This clutters
everything, and important warnings go lost.
I suggest a warning switch for the 64 bit to 32 bit conversion separate
from warnings for other narrowing conversions.

--
Thad
Aug 30 '07 #38

P: n/a
jacob navia <ja***@jacob.remcomp.frwrites:
Ian Collins wrote:
>Why would you want to assign an unsigned value to an int? Why do you
think it makes sense to have a negative size?

Because that int is used in many other contexts later, for instance
comparing it with other integers.
int len = strlen(str);

for (i=0; i<len; i++) {
/// etc
}
The i<len comparison would provoke a warning if len is unsigned...
Only if 'i' is declared as type 'int'. If you declare it to have
type 'size_t', you will not have a problem.
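A minimal sketch of what Ben is suggesting (the function and its name are invented for illustration):

```c
#include <ctype.h>
#include <string.h>

/* With both the counter and the length declared size_t, the i < len
   test compares unsigned to unsigned and draws no signed/unsigned
   comparison warning. */
size_t count_upper(const char *str)
{
    size_t i, len = strlen(str);
    size_t n = 0;
    for (i = 0; i < len; i++) {
        if (isupper((unsigned char)str[i]))
            n++;
    }
    return n;
}
```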
--
char a[]="\n .CJacehknorstu";int putchar(int);int main(void){unsigned long b[]
={0x67dffdff,0x9aa9aa6a,0xa77ffda9,0x7da6aa6a,0xa67f6aaa,0xaa9aa9f6,0x11f6},*p
=b,i=24;for(;p+=!*p;*p/=4)switch(0[p]&3)case 0:{return 0;for(p--;i--;i--)case+
2:{i++;if(i)break;else default:continue;if(0)case 1:putchar(a[i&15]);break;}}}
Aug 30 '07 #39

P: n/a
Ian Collins said:
jacob navia wrote:
>Ian Collins wrote:
>>Richard Heathfield wrote:
On the other hand, does it really make sense to play with trolls?

It beats work...

OK. You win. Will not answer any posts from you.

Bad humour day today?

You normally stop once you realise I'm correct...
Some people's pennies are in orbit.

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -www. +rjh@
Google users: <http://www.cpax.org.uk/prg/writings/googly.php>
"Usenet is a strange place" - dmr 29 July 1999
Aug 30 '07 #40

P: n/a
Ben Pfaff wrote:
jacob navia <ja***@jacob.remcomp.frwrites:
>Ian Collins wrote:
>>Why would you want to assign an unsigned value to an int? Why do you
think it makes sense to have a negative size?
Because that int is used in many other contexts later, for instance
comparing it with other integers.
int len = strlen(str);

for (i=0; i<len; i++) {
/// etc
}
The i<len comparison would provoke a warning if len is unsigned...

Only if 'i' is declared as type 'int'. If you declare it to have
type 'size_t', you will not have a problem.
Of course, but that will lead to MORE changes in a chain reaction
that looks quite dangerous...
Aug 30 '07 #41

P: n/a
jacob navia <ja***@jacob.remcomp.frwrites:
Ben Pfaff wrote:
>jacob navia <ja***@jacob.remcomp.frwrites:
>>Ian Collins wrote:
Why would you want to assign an unsigned value to an int? Why do you
think it makes sense to have a negative size?
Because that int is used in many other contexts later, for instance
comparing it with other integers.
int len = strlen(str);

for (i=0; i<len; i++) {
/// etc
}
The i<len comparison would provoke a warning if len is unsigned...

Only if 'i' is declared as type 'int'. If you declare it to have
type 'size_t', you will not have a problem.

Of course, but that will lead to MORE changes in a chain reaction
that looks quite dangerous...
It is of course possible to run into problems. If you have code
that you know to work in a given environment, then you may not
want to fix it, because it may break that code in that same
environment if you fail to understand the consequences of the
series of changes. But in this case you're talking about moving
the code to a new environment anyhow (32- to 64-bit), in which
case the code has to be tested anew. The choice is then between
maintaining the old version and the new version separately, as
different pieces of code, or making sure that the fixed version
works in both environments. Most of the time, I'd choose the
latter.
--
char a[]="\n .CJacehknorstu";int putchar(int);int main(void){unsigned long b[]
={0x67dffdff,0x9aa9aa6a,0xa77ffda9,0x7da6aa6a,0xa67f6aaa,0xaa9aa9f6,0x11f6},*p
=b,i=24;for(;p+=!*p;*p/=4)switch(0[p]&3)case 0:{return 0;for(p--;i--;i--)case+
2:{i++;if(i)break;else default:continue;if(0)case 1:putchar(a[i&15]);break;}}}
Aug 30 '07 #42

P: n/a
Ben Pfaff said:
jacob navia <ja***@jacob.remcomp.frwrites:
>Ben Pfaff wrote:
>>jacob navia <ja***@jacob.remcomp.frwrites:
<snip>
>>>The i<len comparison would provoke a warning if len is unsigned...

Only if 'i' is declared as type 'int'. If you declare it to have
type 'size_t', you will not have a problem.

Of course, but that will lead to MORE changes in a chain reaction
that looks quite dangerous...

It is of course possible to run into problems.
It is also possible to steer clear of problems. The "chain reaction"
simply doesn't happen if everything has the right type to start off
with. And if it doesn't, the chain reaction is a good thing, not a bad
thing, because it reveals type misconceptions in the code.

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -www. +rjh@
Google users: <http://www.cpax.org.uk/prg/writings/googly.php>
"Usenet is a strange place" - dmr 29 July 1999
Aug 30 '07 #43

P: n/a
Keith Thompson wrote:
CBFalconer <cb********@yahoo.comwrites:
>jacob navia wrote:
... snip ...
>>>
int s = strlen(str) is NOT broken.

Yes it is. How can you guarantee that strlen never returns a value
that exceeds the capacity of an int?

By never passing it a pointer to a string longer than INT_MAX
characters. This tends to be easier than, for example, guaranteeing
that 'x + y' will never overflow.

The declaration may or may not be broken, depending on what happens at
run time. The problem is that, apparently, the programmer knows it's
safe, but the compiler doesn't have enough information to prove it.

The ideal solution is to declare s as a size_t, and to make whatever
other code changes follow from that, but that's not always practical.
Which I said, and you snipped. Why?

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>
--
Posted via a free Usenet account from http://www.teranews.com

Aug 30 '07 #44

P: n/a

"jacob navia" <ja***@jacob.remcomp.frwrote in message
news:46***********************@news.orange.fr...
Ben Pfaff wrote:
>Only if 'i' is declared as type 'int'. If you declare it to have
type 'size_t', you will not have a problem.

Of course, but that will lead to MORE changes in a chain reaction
that looks quite dangerous...
Now you are realising the problem.
In fact if you use size_t safely and consistently, virtually all ints need
to be size_t's. The committee have managed to produce a very far-reaching
change to the C language, simply through fixing up a slight problem in the
interface to malloc().

--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm
Aug 30 '07 #45

P: n/a
CBFalconer <cb********@yahoo.comwrites:
Keith Thompson wrote:
>CBFalconer <cb********@yahoo.comwrites:
>>jacob navia wrote:
... snip ...

int s = strlen(str) is NOT broken.

Yes it is. How can you guarantee that strlen never returns a value
that exceeds the capacity of an int?

By never passing it a pointer to a string longer than INT_MAX
characters. This tends to be easier than, for example, guaranteeing
that 'x + y' will never overflow.

The declaration may or may not be broken, depending on what happens at
run time. The problem is that, apparently, the programmer knows it's
safe, but the compiler doesn't have enough information to prove it.

The ideal solution is to declare s as a size_t, and to make whatever
other code changes follow from that, but that's not always practical.

Which I said, and you snipped. Why?
Sorry, I didn't realize I was repeating some of what you said.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Aug 30 '07 #46

P: n/a
Richard Heathfield <rj*@see.sig.invalidwrites:
Ben Pfaff said:
>jacob navia <ja***@jacob.remcomp.frwrites:
>>Ben Pfaff wrote:
jacob navia <ja***@jacob.remcomp.frwrites:
<snip>
>>>>The i<len comparison would provoke a warning if len is unsigned...

Only if 'i' is declared as type 'int'. If you declare it to have
type 'size_t', you will not have a problem.

Of course, but that will lead to MORE changes in a chain reaction
that looks quite dangerous...

It is of course possible to run into problems.

It is also possible to steer clear of problems. The "chain reaction"
simply doesn't happen if everything has the right type to start off
with. And if it doesn't, the chain reaction is a good thing, not a bad
thing, because it reveals type misconceptions in the code.
Ridiculous. Everything doesn't have the "right type" to start
with. Hence the chain reaction.

Millions of programmers the world over use int as a size store for
strings they know to be only a "few bytes" long. It might not be "right"
now, but there is a huge legacy of it.

A chain reaction of this type in a huge legacy code base could cause
all sorts of side effects. You tell the head of QA that moving from int
to size_t will "just work". Not in the real world it doesn't.
Aug 30 '07 #47

P: n/a
jacob navia wrote:
Ben Pfaff wrote:
>jacob navia <ja***@jacob.remcomp.frwrites:
>>Ian Collins wrote:

Why would you want to assign an unsigned value to an int? Why
do you think it makes sense to have a negative size?

Because that int is used in many other contexts later, for instance
comparing it with other integers.

int len = strlen(str);

for (i=0; i<len; i++) {
/// etc
}

The i<len comparison would provoke a warning if len is unsigned...

Only if 'i' is declared as type 'int'. If you declare it to have
type 'size_t', you will not have a problem.

Of course, but that will lead to MORE changes in a chain reaction
that looks quite dangerous...
No, that will eventually lead to more accurate code with fewer
concealed traps. This is C, not B. The fact that you ignore all
these recommendations is a strong indication that your code is not
safe to use.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Aug 30 '07 #48

P: n/a
jacob navia wrote:
>
.... snip ...
>
That int is used in many other contexts later, for instance
comparing it with other integers.

int i,len = strlen(str);

for (i=0; i<len; i++) {
/// etc
}

The i<len comparison would provoke a warning if len is unsigned...

If I make i unsigned too, then its usage within the loop will
provoke even more problems!
Why? Nothing can create a problem unless you pass its value to an
int, and that value is outside the range that that int can
express. If that happens, lo, you have found a bug. Meanwhile you
have the opportunity to use shift operations on it, to overflow it
without creating undefined-behaviour situations, etc.
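A sketch of the wraparound guarantee Chuck alludes to (the function name is invented):

```c
#include <limits.h>

/* Unsigned arithmetic is defined to wrap modulo 2^N, so an unsigned
   "overflow" is well-defined wraparound -- unlike signed overflow,
   which is undefined behaviour in C. */
unsigned wrap_demo(void)
{
    unsigned u = UINT_MAX;
    return u + 1u;   /* wraps to 0 */
}
```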

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Aug 30 '07 #49

P: n/a
Richard Heathfield wrote:
Ian Collins said:
>jacob navia wrote:

<snip>
>>int s = strlen(str) is NOT broken.

Why would you want to assign an unsigned value to an int? Why do
you think it makes sense to have a negative size?

Well, obviously it doesn't make any sense at all, and assigning
strlen's result to an int is clearly wrong; strlen yields size_t,
not int.

On the other hand, does it really make sense to play with trolls?
Now that is not fair. Yes, Jacob has peculiar (and many are
unsound) ideas, but that does not make him a troll. He seems to
have co-operated on advertising his compiler, for example, without
specifically acknowledging doing so.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Aug 30 '07 #50
