The code below:
function lookForMatches()
{
    for (var a = 1; a <= 2000; a++)
    {
        for (var b = 1; b <= a; b++)
        {
            // c is a sum of two positive cubes
            var c = a * a * a + b * b * b;
            // c is Fibonacci iff 5*c*c - 4 or 5*c*c + 4 is a perfect square
            var d = 5 * c * c;
            var e = d - 4;
            var f = d + 4;
            var g = Math.floor(Math.sqrt(e)) * Math.floor(Math.sqrt(e));
            var h = Math.floor(Math.sqrt(f)) * Math.floor(Math.sqrt(f));
            if (g == e || h == f)
                document.getElementById("matches").innerHTML += c + " ";
        }
    }
    document.getElementById("matches").innerHTML += "Done.";
}
is supposed to check for Fibonacci numbers in the sequence of sums of two positive cubes, but the output is:
2 5075441272 2976917960 7114243176 Done.
I ran the Fibonacci test manually in Windows Calculator, and although the square root of 5*5075441272^2 ± 4 is extremely close to an integer value, it's definitely not an integer! I'm not even certain that 5075441272 is actually a sum of two cubes. How large can integer values be in JavaScript without introducing roundoff error?
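
For what it's worth, here is a minimal sketch of the same test done in exact integer arithmetic, assuming an engine with BigInt support; isqrt is a hand-rolled integer square root (Newton's method), not a built-in:

    // A positive integer n is Fibonacci iff 5*n*n + 4 or 5*n*n - 4
    // is a perfect square.
    function isqrt(n)
    {
        // Integer square root by Newton's method; BigInt division
        // truncates, so this converges to the floor of the square root.
        if (n < 2n) return n;
        var x = n, y = (x + 1n) / 2n;
        while (y < x)
        {
            x = y;
            y = (x + n / x) / 2n;
        }
        return x;
    }

    function isFibonacci(n)
    {
        var d = 5n * n * n;  // exact, no matter how large
        var r1 = isqrt(d - 4n);
        var r2 = isqrt(d + 4n);
        return r1 * r1 === d - 4n || r2 * r2 === d + 4n;
    }

    console.log(isFibonacci(5075441272n));  // false
    console.log(5 * 5075441272 * 5075441272 > Number.MAX_SAFE_INTEGER);  // true

With ordinary doubles, 5*c*c for this value is roughly 1.3e20, far beyond the largest exactly representable integer, which is presumably what makes the perfect-square test misfire.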