Someone asked a js question:
var i = 0.07;
var r = i*100;
alert(r);
Why is the result 7.000000000000001?
After checking the documentation, we learn that JavaScript does not distinguish between integer and float types when storing variables; all numbers are stored uniformly as floating point. JavaScript uses the 64-bit binary floating-point format (binary64, i.e. double precision) defined by the IEEE 754-2008 standard: http://en.wikipedia.org/wiki/IEEE_754-2008
A binary64 value has 53 bits of significand precision, and the decimal fraction 0.07 has no exact representation in binary at that precision. The value actually stored is slightly larger than 0.07 (about 0.07000000000000000666). Multiplying it by 100, with the product again rounded to the nearest representable double, yields 7.000000000000001, which is what alert displays.
Similarly, we can predict that 1/3 evaluates to 0.3333333333333333 rather than an exact third.
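To see the stored values directly, we can ask for more digits than JavaScript's default formatting prints. A small illustrative snippet (toPrecision is standard; the digits in the comments are approximate):

alert((0.07).toPrecision(21));       // ≈ 0.0700000000000000066613 (the closest double to 0.07)
alert((0.07 * 100).toPrecision(21)); // ≈ 7.00000000000000088818 (displayed by default as 7.000000000000001)
alert((1 / 3).toPrecision(21));      // ≈ 0.333333333333333314830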
So how do we correct this value?
You can use the following methods:
1. parseInt
var r4 = parseInt(i * 100); // converts to a string, then truncates, giving 7
2. Math.round
var r2 = Math.round((i * 100) * 1000) / 1000; // rounds to three decimal places, giving 7
Both of the methods above yield 7.
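One caveat, as an aside beyond the original post: parseInt truncates rather than rounds, so it only recovers the intended integer when the floating-point error lands above it. For a value like 0.29 the error falls below the integer, and Math.round is the safer choice:

var j = 0.29;
alert(j * 100);             // 28.999999999999996
alert(parseInt(j * 100));   // 28 (truncation gives the wrong integer)
alert(Math.round(j * 100)); // 29 (rounding recovers the intended value)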
Attached is the full test code:
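A minimal version of the test script, simply combining the snippets above into one runnable sketch:

var i = 0.07;
var r = i * 100;                              // raw product: 7.000000000000001
var r4 = parseInt(i * 100);                   // method 1: truncates to 7
var r2 = Math.round((i * 100) * 1000) / 1000; // method 2: rounds to three decimals, giving 7
alert(r);
alert(r4);
alert(r2);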