My professor has a program that takes a seed and a function and iterates the function, feeding each output back in as the next input.
The basic idea is that you have some f(x) and a starting value x_0, so that x_1 = f(x_0), then x_2 = f(x_1), and so on, with x_n = f(x_{n-1}) in general.
Anyway, iterating like this can sometimes produce cycles. For example, let f(x) = 2x mod 1.
In the program, you can input 0.2 and expect the cycle (0.2, 0.4, 0.8, 0.6, 0.2, ...).
But eventually his program does something weird: around the 50th iteration you get something like 0.20001 where you would expect 0.2, and because of that the iterates break out of the cycle, eventually reach 1, and after that every output is 0.
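To show what I mean, here is a rough sketch in Python of what I think the program is doing (this is my own reconstruction, not his actual code):

```python
# My own sketch of the setup, not the professor's actual program:
# iterate f(x) = 2x mod 1 starting from the seed 0.2.

def f(x):
    return (2 * x) % 1.0

x = 0.2
for n in range(60):
    print(n, x)
    x = f(x)
```

When I run something like this, the first iterates do look like the 0.2, 0.4, 0.8, 0.6 cycle, but a tiny error creeps in (0.6000000000000001, then 0.20000000000000018, and so on), grows every time around the loop, and eventually the values collapse to exactly 0, just like he described.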
I assume this has something to do with how computers approximate values, but I just don't get why it would eventually calculate 0.20001 instead of 0.2. Can someone give me a technical explanation of this?
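For what it's worth, just printing 0.2 with more digits (again in Python, only to poke at it) already shows that the stored value isn't exactly 0.2, so I assume that's part of the story:

```python
# 0.2 has no finite binary expansion, so the value that actually gets
# stored is only the closest representable double to 0.2.
print(f"{0.2:.20f}")  # 0.20000000000000001110
```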