I'm writing a Python script that returns a list of terms of the Fibonacci sequence, given a start term and an end term. For example, if I enter 0 as the start term and 6 as the end term, the output should be:
[0, 1, 1, 2, 3, 5, 8]
Oddly, when I ran the program, the output was:
[0.0, 1.0, 1.0, 2.0, 3.0000000000000004, 5.000000000000001, 8.000000000000002]
To calculate the terms of the sequence, I used Binet's formula, which I entered as ((1 + math.sqrt(5))**x - (1 - math.sqrt(5))**x) / (2**x * math.sqrt(5)). I entered the same formula into a few other calculators to see if they'd give me decimal answers, and none of them did. Did I somehow mistype the formula, or is Python miscalculating it?
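In case it helps, here is a minimal, simplified sketch of how the formula is being used (not my exact code, but it produces the same decimal output):

import math

def fib_term(x):
    # Binet's formula: F(x) = (phi**x - psi**x) / sqrt(5), with
    # phi = (1 + sqrt(5)) / 2 and psi = (1 - sqrt(5)) / 2, rewritten so the
    # powers of (1 + sqrt(5)) and (1 - sqrt(5)) share the 2**x denominator.
    return ((1 + math.sqrt(5))**x - (1 - math.sqrt(5))**x) / (2**x * math.sqrt(5))

def fib_range(start, end):
    # Collect the terms from the start index through the end index, inclusive.
    return [fib_term(x) for x in range(start, end + 1)]

print(fib_range(0, 6))
# Prints the decimal list shown above rather than whole numbers.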