Espen wrote on 26 May 2006 in comp.lang.javascript:
<SCRIPT language="JavaScript">
Do:
<script type='text/javascript'>
Rente = "5,9";
Rente2 = parseFloat(Rente.replace(",",".")/100);
document.write(Rente2 + '<br>');
Why would you first divide by 100,
and then, the result already being numeric (or NaN),
do a parseFloat()?
Rente = "4,9";
Rente2 = parseFloat(Rente.replace(",",".")/100);
document.write(Rente2 + '<br>');
</SCRIPT>
Result:
0.059000000000000004
0.049
Problem:
Where does the "000000000000004" come from? Is there a workaround?
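As an aside, the logical fix for the quoted line is to parse first and divide afterwards — but note that this yields exactly the same rounding noise; a minimal sketch:

```javascript
// Parse the comma-decimal string to a number first, then divide.
var rente = "5,9";
var rente2 = parseFloat(rente.replace(",", ".")) / 100;
// Still prints 0.059000000000000004, because 0.059 has no
// exact binary representation.
console.log(rente2);
```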
This, Espen, has been told hundreds of times on this NG [see the
archive]:
because some decimal fractions cannot be converted exactly to binary
coded floating point values, while others can.
There are only two ways, apart from using a language that stores decimal
fractions:
1. do decent rounding and a final string manipulation;
2. use only integers and insert the decimal point with string
manipulation when displaying.
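Both ways can be sketched in a few lines (the variable names here are my own, for illustration):

```javascript
// Way 1: do the floating point arithmetic, then round and format
// the result as a string.
var r = parseFloat("5,9".replace(",", ".")) / 100;
var rounded = r.toFixed(3);                 // "0.059" — rounding hides the binary noise
console.log(rounded);

// Way 2: stay in integers (here: thousandths) and only insert the
// decimal point as a string operation when displaying.
var thousandths = 59;                       // 5,9 % expressed as 59/1000
var whole = Math.floor(thousandths / 1000); // 0
var frac = String(thousandths % 1000);      // "59"
while (frac.length < 3) frac = "0" + frac;  // pad to "059"
console.log(whole + "." + frac);            // "0.059"
```

Way 1 is the usual choice for display; way 2 avoids floating point entirely, which matters when the values are money or interest rates that must add up exactly.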
the NG-FAQ
<http://www.jibbering.com/faq/>
says:
Javascript numbers are represented in binary as IEEE-754 Doubles, with a
resolution of 53 bits, giving an accuracy of 15-16 decimal digits;
integers up to about 9e15 are precise, but few decimal fractions are.
Given this, arithmetic is as exact as possible, but no more. Operations
on integers are exact if the true result and all intermediates are
integers within that range.
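The FAQ's limits are easy to demonstrate directly (my own examples, not from the FAQ):

```javascript
// Integers are exact up to 2^53 = 9007199254740992 (about 9e15);
// beyond that, consecutive integers collapse:
console.log(9007199254740992 + 1 === 9007199254740992); // true

// Most decimal fractions are not exact:
console.log(0.1 + 0.2);  // 0.30000000000000004, not 0.3

// Fractions built from powers of 1/2 are exact:
console.log(0.25 + 0.5 === 0.75); // true
```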
--
Evertjan.
The Netherlands.
(Please change the x'es to dots in my email address)