My question is simple.
In pure mathematics:

1/3 = 0.33333... (recurring)
1/3 + 1/3 + 1/3 = 1

But if you add the recurring decimal expansions digit by digit, you instead get:

1/3 + 1/3 + 1/3 = 0.99999... (recurring)
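For illustration, here is what actually happens in a language that uses IEEE-754 binary floating point (Python here, chosen only because the question names no language; the printed values assume typical round-to-nearest double precision):

```python
# 1/3 is not stored as a recurring decimal at all; it is rounded to the
# nearest representable 53-bit binary fraction.
x = 1 / 3
print(x)               # 0.3333333333333333
print(x + x + x)       # 1.0 -- here the rounding errors happen to cancel
print(x + x + x == 1)  # True on IEEE-754 hardware
```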
My questions are:

1. How do programmers eliminate this error? (One approach I came across is sketched after this list.)
2. Does this error elimination have a name?
3. 1/3 + 1/3 + 1/3 = 0.9999... seems correct, while 1/3 + 1/3 + 1/3 = 1 seems to defy logic. How is such a seemingly illogical operation carried out in programming?
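For context on question 1, here is a minimal sketch of one way I understand the error can be avoided entirely: exact rational arithmetic instead of floating point (again Python, using its standard-library fractions and decimal modules; an illustration, not necessarily standard practice):

```python
from fractions import Fraction
from decimal import Decimal

# Exact rational arithmetic: 1/3 is kept as the integer pair (1, 3), so
# no rounding ever happens and the sum is exactly 1.
third = Fraction(1, 3)
print(third + third + third)  # 1

# Fixed-precision decimal arithmetic, by contrast, reproduces the
# 0.999... behaviour described above, because the recurring expansion
# is cut off after a finite number of digits.
d = Decimal(1) / Decimal(3)
print(d + d + d)  # 0.9999999999999999999999999999
```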
I have also asked this question in the mathematics community on math.stackexchange; if you are both a mathematician and a programmer, please consider answering it there as well.