I have this code:

double timeTillClear = 15.0; // countdown in seconds, decremented by 0.1 per tick
private void lower_Tick(object sender, EventArgs e)
{
    if (timeTillClear > 0)
    {
        timeTillClear -= 0.1;
        clearingIn10SecondsToolStripMenuItem.Text = "Clearing in " + timeTillClear + " seconds.";
    }
    else
    {
        lower.Enabled = false;
    }
}

lower ticks once every 100 milliseconds. When the countdown:

  • Gets to 8
  • Gets to 5
  • Gets to 1

the displayed value is off by 0.000000000000001. Why?

1 Answer

Much like 1/3 cannot be represented exactly in decimal notation (0.333333...), 0.1 cannot be represented exactly as a binary floating-point number, which is how a double stores values internally (per IEEE 754). That is why you see that inherent error.
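
You can watch the error accumulate without any timer. Here is a minimal console sketch of the same arithmetic (the class name and loop are mine, not from the question): it subtracts 0.1 from 15.0 the way lower_Tick does and prints each intermediate value with the round-trip format specifier "R", which shows the full stored double rather than the rounded default:

using System;

class FloatingPointDrift
{
    static void Main()
    {
        // Start from the same 15.0 seconds as the question's countdown.
        double timeTillClear = 15.0;
        while (timeTillClear > 0)
        {
            // 0.1 has no exact binary representation, so each
            // subtraction introduces a tiny rounding error.
            timeTillClear -= 0.1;
            Console.WriteLine(timeTillClear.ToString("R"));
        }
    }
}

Near 8, 5, and 1 the output shows values such as 8.000000000000001 or 7.999999999999999 instead of the exact number; the precise digits depend on how the per-step rounding errors happen to accumulate.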
