When I try to do a simple floating point calculation, I get the wrong result:
var a = 0.1111;
var b = a * 100;
alert(b); // Shows 11.110000000000001 instead of 11.11
Why this happens is discussed in Is floating point math broken?, but how can you get around this problem in JavaScript?
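For example, is rounding the result to a fixed number of decimal places after the arithmetic a reasonable way to handle it? A minimal sketch of that idea is below; roundTo is just a hypothetical helper I wrote for illustration, not a built-in, and the precision of 2 is an arbitrary choice.

function roundTo(value, decimals) {
  // Scale, round to the nearest integer, then scale back down.
  var factor = Math.pow(10, decimals);
  return Math.round(value * factor) / factor;
}

var a = 0.1111;
var b = roundTo(a * 100, 2);
alert(b); // 11.11

Or is there a more robust approach when the number of decimal places isn't known in advance?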