1

I'm struggling to understand why a calculation in C# displays a different result in the console window compared to the result in the debugger.

The following code displays 0.1647513 in the console, but 0.164751321 in the debugger:

        var v = (float) (0.172731235 + -0.200918653*0.04 + 0.03552264*0.04*0.04);
        Console.WriteLine(v);
        Console.ReadKey();

Can anyone explain this please?

Thanks

Steve Parry
    Well the code you've given doesn't display *anything* in the console. It's not clear what you mean by "in debug" either. – Jon Skeet Nov 19 '15 at 14:49
  • This link can help : http://stackoverflow.com/questions/1421520/formatting-doubles-for-output-in-c-sharp – Tristan Djahel Nov 19 '15 at 14:52
  • Sorry, I've updated the code. What I mean by debug is when I breakpoint the code and evaluate the variable "v" – Steve Parry Nov 19 '15 at 14:56
  • 1
    The debugger just shows more digits. For `float` those are irrelevant as they are beyond the accurate range of the data type here. Keep in mind that this is binary floating-point, not decimal, so the decimal representation is almost always much longer, but without adding any accuracy – Joey Nov 19 '15 at 15:01

1 Answer

2

I think the answer is here: C# float.ToString Rounding Values

Basically, a `float` is formatted with up to 7 significant digits by default (what you're seeing in the console output), but up to 9 digits are needed to represent its exact value in a round-trippable way (what you're seeing when you break execution and inspect it in the debugger).

Try `Console.WriteLine(v.ToString("G9"));` and see if it prints the same value as the debugger.
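For reference, a minimal sketch of the difference using the question's own expression (the exact default output may vary by runtime; .NET Core 3.0+ changed the default to the shortest round-trippable form):

```csharp
using System;

class Program
{
    static void Main()
    {
        // Same expression as in the question: evaluated in double, then narrowed to float.
        var v = (float)(0.172731235 + -0.200918653 * 0.04 + 0.03552264 * 0.04 * 0.04);

        // Default "G" formatting for float uses up to 7 significant digits.
        Console.WriteLine(v);                // e.g. 0.1647513

        // "G9" asks for up to 9 significant digits -- enough to round-trip any
        // float -- and matches what the debugger shows.
        Console.WriteLine(v.ToString("G9")); // 0.164751321
    }
}
```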

simonalexander2005
  • Thanks for the answer simonalexander2005, although I still don't understand why the displayed result differs from the actual value. – Steve Parry Nov 19 '15 at 15:18