Possible Duplicate:
Is JavaScript’s Floating-Point Math Broken?
While coding, I came across something weird in JavaScript. I'm not sure it's a bug; maybe I just don't know why it happens, but it looks really strange to me, so I made a simplified example. Here is the code:
var v = 0.01;
for (var i = 0; i < 21; i++) {
    // adds 0.01 at i = 4, 8, 12, 16, 20 (five times in total)
    if (i % 4 === 0 && i !== 0) {
        v += 0.01;
    }
}
What I'm expecting to be true:
v == 0.06;
What is actually true:
v == 0.060000000000000005;
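To check whether the loop itself was at fault, I also tried a couple of direct one-liners (a minimal sketch; I'm assuming toPrecision and toFixed just control how many digits get printed):

    // 0.01 apparently isn't stored exactly as a binary double
    console.log((0.01).toPrecision(20)); // "0.010000000000000000208"

    // the same drift shows up without any loop at all
    console.log(0.1 + 0.2); // 0.30000000000000004

    // rounding for display gives the value I expected
    console.log((0.060000000000000005).toFixed(2)); // "0.06"

So the stored value seems to be only a hair off 0.06, and the long tail only shows up when the full precision is printed or compared.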
Can someone explain why I get this instead of what I'm expecting? Thanks.