
My professor has a program that takes a seed and a function and iterates: each output is fed back in as the next input.

The basic idea is that you have some f(x) and a starting value x_0, so that f(x_0) = x_1. Then x_2 = f(x_1), and so on, with x_n = f(x_{n-1}) in general.

Anyway, you can sometimes get cycles this way. For example, let f(x) = 2x mod 1.

In a program, you can input 0.2 and expect a cycle: (0.2, 0.4, 0.8, 0.6, 0.2, ...)
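Something like this minimal C sketch (my own reconstruction, assuming doubles and fmod for the mod; his actual program may differ):

    #include <stdio.h>
    #include <math.h>

    /* f(x) = 2x mod 1, the doubling map */
    static double f(double x) {
        return fmod(2.0 * x, 1.0);
    }

    int main(void) {
        double x = 0.2;              /* the seed */
        for (int i = 0; i < 60; i++) {
            printf("x_%d = %g\n", i, x);
            x = f(x);                /* feed each output back in */
        }
        return 0;
    }

With %g's roughly six significant digits this prints the clean-looking cycle at first, then values like 0.20001, then zeros.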

But eventually his program does something weird: around the 50th iteration, you get 0.20001 where you would expect 0.2. Because of that, the program breaks out of the cycle and reaches 1, and then every output after that is 0.

I assume this has something to do with how computers approximate values, but I just don't get why it would eventually calculate 0.20001 instead of 0.2. Can someone give me a technical explanation of this?

Steady
  • Floating-point numbers are frequently approximations... start here: http://stackoverflow.com/questions/7644699/floating-point-representation – Ottak Sep 14 '14 at 22:53

1 Answer


Floating-point arithmetic is deterministic(*).

You think you are seeing an initial periodic phase that then changes into something else (“0.2, 0.4, 0.8, 0.6, 0.2”), but if you printed the values with enough precision (use the format %.16e if you are using C or a C-like language), you would see that the first “0.2” is not the same number as the second “0.2”.
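For example, a quick check along these lines (a sketch assuming doubles and fmod; the exact digits depend on your platform) makes the difference visible:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double x = 0.2;
        /* print the first five iterates of f(x) = 2x mod 1 in full precision */
        for (int i = 0; i <= 4; i++) {
            printf("x_%d = %.16e\n", i, x);
            x = fmod(2.0 * x, 1.0);
        }
        return 0;
    }

Printed this way, x_4 is visibly not the same double as the seed x_0, even though both look like 0.2 at low precision.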

If any value really repeated during the first phase, then the sequence would effectively be periodic and all other values would repeat ad infinitum, just as you would expect. It never does: 0.2 has no finite binary representation, so the stored double is only a nearby dyadic fraction, and each application of 2x mod 1 shifts the significand up one bit and discards the bit that crosses into the integer part. The tiny initial representation error therefore doubles every iteration, which is why it shows up around the fifth decimal place after roughly 40 iterations; and once all of the significand's roughly 53 bits have been shifted out, the iterate is exactly 0 from then on.
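You can watch the bits drain away with C99's %a hexadecimal float format (a sketch under the same assumptions as above):

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double x = 0.2;  /* stored as the nearby dyadic fraction 0x1.999999999999ap-3 */
        for (int i = 0; i < 56; i++) {
            printf("x_%2d = %a\n", i, x);
            x = fmod(2.0 * x, 1.0);  /* shift left one bit, drop the integer bit */
        }
        return 0;
    }

The trailing hex digits disappear a little at a time until the value collapses to 0x0p+0 and stays there.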


(*) Unless you are programming in a language with just-in-time compilation and crappy floating-point semantics. Java can be compiled just-in-time but the meaning of floating-point operations is defined quite strictly, and the operations must implement this definition before and after just-in-time compilation, leaving no room for weird behavior like this. C#'s definition of floating-point operations is weird enough that a program could change behavior after a while because of just-in-time compilation, but that is probably the only widely used language with such horrible properties.

Pascal Cuoq