I have:
var a = 0.0532;
var b = a * 100;
b should be returning 5.32 but instead it's returning 5.319999999999999. How do I fix this?
JSFiddle here: http://jsfiddle.net/9f2K8/
You should use .toFixed():
var a = 0.0532;
var b = a * 100;
b.toFixed(2); // "5.32" (specify the number of decimals to be displayed)
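Note that toFixed() returns a string and does not change b itself, so assign the result to something. If you need a number again afterwards, parseFloat() converts it back (the variable names below are just for illustration):
var a = 0.0532;
var b = a * 100;
var display = b.toFixed(2);        // "5.32" (a string)
var rounded = parseFloat(display); // 5.32 as a number, if you need one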
This is not an error.
JavaScript is trying to represent 5.32 with as much precision as possible. Since computers don't have infinite binary precision, the computation ends up with the closest value it can get: 5.319999999999999.
If your problem lies with numerical operations, you can add/multiply/etc. these numbers without trouble. They are so close to the intended numbers that the results will stay within a negligible margin of error.
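If you actually need the numeric value rounded (not just displayed), one common trick is to scale, round, and scale back. A minimal sketch, assuming you want two decimal places:
var a = 0.0532;
var b = a * 100;                         // 5.319999999999999
var rounded = Math.round(b * 100) / 100; // 5.32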
If your problem lies with comparing numbers, the common approach is to ditch == and instead compare using a defined margin of error. For example:
// Two previously obtained instances of the "same" number:
var a = 5.32;
var b = 5.319999999999999;
// Don't do this:
if (a == b) {}
// Do this instead (hide it in a function):
var margin = 0.000001;
if (Math.abs(a - b) < margin) {}
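A minimal sketch of such a comparison helper (the function name and default margin here are just examples, not a standard API):
function approximatelyEqual(a, b, margin) {
  margin = margin || 0.000001;    // default tolerance; adjust to your data
  return Math.abs(a - b) < margin;
}
approximatelyEqual(5.32, 5.319999999999999); // true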
If your problem is visual, you can use toFixed() to create a rounded, human-readable string:
var number = 123.4567;
number.toFixed(2); // '123.46'
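toFixed() also pads with zeros when the number has fewer decimals than requested, so displayed values keep a consistent width (the variable name is just an example):
var price = 5.3;
price.toFixed(2); // '5.30'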