I'm struggling to understand why a calculation in C# displays a different result in the console window than in the debugger.
The following code displays 0.1647513 in the console, but 0.164751321 in the debugger:
var v = (float)(0.172731235 + -0.200918653 * 0.04 + 0.03552264 * 0.04 * 0.04);
Console.WriteLine(v);
Console.ReadKey();
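For reference, here is a small sketch of how I've been inspecting the value. Formatting the same `float` with explicit precision specifiers ("G9" and round-trip "R") prints more digits than the default `Console.WriteLine`, which seems related to what the debugger shows. (The exact digit strings printed by the default format may vary between .NET Framework and .NET Core.)

```csharp
using System;

class Program
{
    static void Main()
    {
        // The expression is evaluated in double precision, then the result
        // is rounded to float (roughly 7 significant decimal digits).
        var v = (float)(0.172731235 + -0.200918653 * 0.04 + 0.03552264 * 0.04 * 0.04);

        Console.WriteLine(v);                 // default "G" format
        Console.WriteLine(v.ToString("G9"));  // 9 significant digits
        Console.WriteLine(v.ToString("R"));   // round-trip representation
    }
}
```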
Can anyone explain this, please?
Thanks.