I find myself needing to do some work with pretty small decimals, formatted a particular way, and JavaScript is doing weird things to them that I don't understand.

Each number enters the function formatted like this: 0.0000000000. That's ten decimal places. Most of the time, that will mean trailing zeros, like this: 0.0011000000.

Now, I wouldn't think this would matter, because before doing any other work with them, I turn them into integers through the simple expedient of multiplying them by exactly 10000000000.

But when I do that, I sometimes get results that look like this:

var bigNum = 0.0000050000 * 10000000000; //Returns a value of 50000.00000000001

What the heck is going on here? Where is that extra 1 at the end coming from?

Damon Kaswell
  • Basic principle of how floats work. – Niet the Dark Absol Apr 25 '13 at 18:32
  • If you know the exact number of decimals, consider having your number as a string: `var bigNum = parseInt("0.0000050000".replace(".",""),10);` (sketched after these comments) – Niet the Dark Absol Apr 25 '13 at 18:34
  • You might want to look into many of the big number libraries out there for javascript (depending on your needs), as javascript does use floating point numbers. – Daniel Moses Apr 25 '13 at 18:34
  • I feel dumb now. For some damn fool reason, I'd forgotten about the floats. Duuuuurrr... Using toString() should do the trick. And now I'm going to spend a little time kicking myself. – Damon Kaswell Apr 25 '13 at 18:42
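
A minimal sketch of the string-based conversion suggested in the comments, assuming every value really does arrive as a string with exactly ten decimal places (the helper name toScaledInt is just for illustration):

// Convert a ten-decimal string such as "0.0000050000" into an integer
// count of 1e-10 units, without ever multiplying a float.
function toScaledInt(str) {
  // Dropping the decimal point turns "0.0000050000" into "00000050000",
  // which parses exactly as the integer 50000.
  return parseInt(str.replace(".", ""), 10);
}

var bigNum = toScaledInt("0.0000050000"); // 50000, no trailing .00000000001

The multiplication route drifts because 0.000005 has no exact binary representation as an IEEE 754 double, so 0.0000050000 * 10000000000 can come out as 50000.00000000001; the string route never leaves integer arithmetic.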

0 Answers