
I wrote a quick JavaScript snippet for a simple calculation. When I ran it, it produced strange output that I can't explain...

The script, in its most basic form, was this:

for (var a = 0; a < 100; a++) {
    var b = 3.6 * a;
    document.write(b + "<br />");
}

Here's the demo: http://jsfiddle.net/uKa2G/

I was expecting answers with a single decimal place or none at all, but every few lines there would be a long run of decimals.

If someone could explain why, it would be appreciated.

Aadit M Shah
Phinet
  • Is that really `i` or `a`, or were you lazy? LOL – vee Jul 16 '13 at 04:21
  • Don't mind me, I pressed the wrong key while typing in the question entry box ^^' ('i' should be 'a'); it still messes up. – Phinet Jul 16 '13 at 04:23
  • Short answer being that floating-point arithmetic (i.e. arithmetic with non-integers) has a certain degree of inaccuracy, because computers in general only approximate the value of a floating point number in memory (unlike integers, which can be modeled to an exact value). – Richard Neil Ilagan Jul 16 '13 at 04:25
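The rounding error described in the comment above is easy to demonstrate directly in the console. A minimal sketch (JavaScript numbers are IEEE 754 double-precision floats, so values like 0.1, 0.2, and 3.6 cannot be stored exactly):

```javascript
// 0.1 and 0.2 are both stored as approximations, so their sum
// is not exactly 0.3 - the error becomes visible when printed.
console.log(0.1 + 0.2);              // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);      // false

// Rounding for display hides the error.
console.log((0.1 + 0.2).toFixed(1)); // "0.3"
```

The same effect explains why some multiples of 3.6 in the loop print with a long tail of decimals while others happen to land on an exactly representable value.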

1 Answer


If you're expecting no decimals, then you have to use parseInt() to truncate the value to an integer.

for (var a = 0; a < 100; a++) {
    var b = 3.6 * a;
    console.log(parseInt(b));
}
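As an aside (not part of the original answer): parseInt is designed for parsing strings, so it first coerces the number to a string; for numeric rounding, Math.floor, Math.round, or toFixed are more direct. A minimal sketch of the alternatives:

```javascript
var b = 3.6 * 7; // slightly above 25.2 due to floating-point error

console.log(Math.floor(b)); // drops the fractional part -> 25
console.log(Math.round(b)); // rounds to the nearest integer -> 25
console.log(b.toFixed(1));  // formats with one decimal place -> "25.2"
```

Note that toFixed returns a string, which is usually fine when the goal is display, as with document.write in the question.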
Praveen