"mudman" <mu****@discussions.microsoft.com> wrote in message
news:94**********************************@microsoft.com...
I'm running Visual Studio .NET and am experiencing a problem during
debugging (I assume this will also be the case during normal operation).
If the variable "test" is a double and "denom" a finite integer, the
operation test = 1 / denom; results in test having a value of 0.000000!
However, if denom is also a double, test has the correct value. Dumb
question perhaps, but am I missing something?
If you have something like this
int denom;
double test;
test = 1 / denom;
then test will be assigned zero whenever denom is greater than 1
(denom == 1 gives 1, and denom == 0 is a divide-by-zero error). That's
because the division is carried out in integer arithmetic: 1 / denom
truncates toward zero, yielding the integer 0, which is then converted
to the double 0.0 on assignment.
If, on the other hand, the expression mixes integers and doubles, then
the integers are promoted to doubles before the division and assignment
happen, so you get the floating-point result you expect.
Is that what you see?
Regards,
Will