In comp.lang.javascript message <11*********************@g4g2000hsf.googlegroups.com>, Fri, 15 Jun 2007 02:31:26, dd <dd****@gmail.com> posted:

>On Jun 15, 1:00 am, "FAQ server" <javascr...@dotinternet.be> wrote:
>>Why does 5 * 1.015 != 5.075 or 0.06+0.01 != 0.07?
>
>Perhaps this could use a bit more explanation. When I first looked at
>this it's not obvious that it's mostly just a comparison problem.

Ideally, that would be because it isn't mostly a comparison problem,
AFAICS; certainly not predominantly so. It is also a visible-rendition
problem.
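The rendition aspect can be seen directly: the default decimal string shows the inexact stored value, and fixed-point formatting merely hides it without changing the number (a minimal illustration):

```javascript
// The stored double for 0.06+0.01 is slightly below 0.07; the default
// string rendition shows as many digits as are needed to round-trip it.
var x = 0.06 + 0.01;
console.log(String(x));      // "0.06999999999999999"
console.log(x.toFixed(2));   // "0.07" - fixed-point rendition; value unchanged
console.log(x === 0.07);     // false - the comparison still fails
```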

>Someone might think "well if it's not 5.075 then JavaScript math is
>crazy". It's not clear that the answer is really something like
>5.0750000001 and just needs a quick multiply by 100, convert to
>integer and then divide by 100 to make it correct.

That won't make the first example work.
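A sketch of why not: the quoted scale-round-rescale trick does recover 0.07, but scaling 5*1.015 by 100 keeps only two decimal places, and 5.075 needs three:

```javascript
var sum = 0.06 + 0.01;                        // actually 0.06999999999999999
console.log(Math.round(sum * 100) / 100 === 0.07);      // true - trick works here

var prod = 5 * 1.015;                         // actually just below 5.075
console.log(Math.round(prod * 100) / 100);              // 5.07 - third decimal lost
console.log(Math.round(prod * 1000) / 1000 === 5.075);  // true - must scale by 1000
```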

But it would be better to put the Subject in English rather than
geekish, and to use the Subject expressions as introductory:

4.7 Why does simple decimal arithmetic give strange results?

For example, 5*1.015 does not give exactly 5.075, and 0.06+0.01 does
not give exactly 0.07, in Javascript.

Javascript numbers are represented in binary ...

Then change "integers" to "digits", for legibility.

And add the points that it is better to divide by 100 than to multiply
by 0.01, since 100 is held exactly and 0.01 cannot be; and that one
should, where possible, work in integers throughout.
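Those two points can be sketched as follows (the cents-based names are illustrative, not from the FAQ):

```javascript
// Work in integers (cents) throughout; small integers are held exactly
// in IEEE doubles, so this addition cannot drift.
var priceCents = 6;
var feeCents = 1;
var totalCents = priceCents + feeCents;   // exactly 7

// Convert for output by dividing by 100, which is held exactly;
// multiplying by 0.01 would use a value that cannot be represented exactly.
console.log(totalCents / 100 === 0.07);   // true
console.log(0.06 + 0.01 === 0.07);        // false - inexact operands drift
```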

--

(c) John Stockton, Surrey, UK. ?@merlyn.demon.co.uk Turnpike v6.05 IE 6

news:comp.lang.javascript FAQ <URL:http://www.jibbering.com/faq/index.html>.

<URL:http://www.merlyn.demon.co.uk/js-index.htm> jscr maths, dates, sources.
<URL:http://www.merlyn.demon.co.uk/> TP/BP/Delphi/jscr/&c, FAQ items, links.