So, I started out trying to find the 100th Fibonacci number using a recursive function, memoized with the following code.
Function.prototype.memoize = function () {
    var originalFunction = this,
        slice = Array.prototype.slice,
        cache = {};
    return function () {
        // The stringified argument list serves as the cache key.
        var key = slice.call(arguments);
        if (key in cache) {
            return cache[key];
        } else {
            return cache[key] = originalFunction.apply(this, key);
        }
    };
};

var fibonacci = function (n) {
    return n === 0 || n === 1 ? n : fibonacci(n - 1) + fibonacci(n - 2);
}.memoize();

console.log(fibonacci(100));
Now, as you can see in this fiddle, JavaScript logs 354224848179262000000 as the result. However, according to WolframAlpha, the hundredth Fibonacci number is actually 354224848179261915075.
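To rule out the memoization itself, I also tried a plain iterative version (fibIterative is just a throwaway name for this test); since it performs the same additions in the same order, it should log the same incorrect value, and for me it does:

var fibIterative = function (n) {
    // Bottom-up: compute fib(2) through fib(n) by simple addition.
    var previous = 0, current = 1, next, i;
    if (n === 0) {
        return 0;
    }
    for (i = 2; i <= n; i++) {
        next = previous + current;
        previous = current;
        current = next;
    }
    return current;
};

console.log(fibIterative(100)); // 354224848179262000000 again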
Now, my question is this: why is the number computed incorrectly, even though the algorithm is completely sane? My thoughts point to JavaScript itself¹, because according to Google's calculator, the two numbers are equal.
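In fact, the exact value does not even seem to survive a round trip through the console on my setup, which suggests it cannot be stored in the first place:

console.log(354224848179261915075); // logs 354224848179261900000 for me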
What is it about JavaScript that causes such an error? The number is safely within the limits of an IEEE 754 double, whose maximum value is 1.7976931348623157e+308.
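For what it is worth, a quick check confirms the value is nowhere near that ceiling:

console.log(354224848179261915075 < Number.MAX_VALUE); // true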
¹ In case this could be a bug on my platform, I have tested this on both Chromium and Firefox on Ubuntu.