"KimD" <Ki**@discussions.microsoft.com> wrote in message

news:3F**********************************@microsof t.com...

> Just made a startling discovery - C# interprets 4/3 as 1 and 2/3 as 0 etc.
> C# needs explicit statement of numerator and denominator as float/double
> etc., e.g. 4d/3d or 4.0/3.0, to be interpreted correctly.
> This seems really archaic... surely I'm doing it wrong and there's
> something I'm missing!?

This is normal behavior for integer division, found in C, C++, Java, and C#.
Many other languages take the same approach to integer division. Any
competently taught beginning programming class will pound this concept into
the heads of students: the result is determined by the operand types, *not*
by the type of the variable where the result is stored.

IOW, double res = 2/3; will not give you 0.6666... but 0 in res, as the
result is already determined by the division operation on two integer
literals.
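A quick sketch of what happens (C# here, but C, C++, and Java behave the same way):

```csharp
using System;

class IntDivisionDemo
{
    static void Main()
    {
        // Both operands are int literals, so 2/3 is integer division
        // and yields the int 0. That 0 is then converted to double
        // only when it is stored in res.
        double res = 2 / 3;
        Console.WriteLine(res);      // prints 0

        // The target type of the assignment never influences the
        // division itself; 4/3 is computed as int and truncates to 1.
        int quotient = 4 / 3;
        Console.WriteLine(quotient); // prints 1
    }
}
```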

If you divide one integer by another, the result will be an integer. Making
one or the other operand a real number type, as you've shown, gets the other
operand promoted to the "higher" type and preserves the precision in the
result.
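For example (again C#; the `d` suffix marks a double literal, and an explicit cast works just as well):

```csharp
using System;

class PromotionDemo
{
    static void Main()
    {
        // Making either operand a double promotes the other operand,
        // so the division is carried out in floating point.
        double a = 2d / 3;        // double literal on the left
        double b = 2 / 3.0;       // double literal on the right
        double c = (double)2 / 3; // explicit cast of one operand

        Console.WriteLine(a); // 0.6666666666666666
        Console.WriteLine(b); // same value
        Console.WriteLine(c); // same value
    }
}
```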

It's *not* a C# thing!

--
Peter [MVP Visual Developer]
Jack of all trades, master of none.