I am implementing some diagnostics for an application so the user can see how quickly the graphics are updating, and I have encountered some seemingly strange behavior. My graphics update function is called by a Timer, and it looks like this:
private void RefreshScreen(object sender, EventArgs e)
{
    DateTime begin = DateTime.Now;
    // Do some updating...
    DateTime end = DateTime.Now;
    graphicsUpdateRate = (end - begin).TotalMilliseconds;
}
When I later display graphicsUpdateRate in a separate window from another function, it is almost always 0. Occasionally it will actually be ~0.5, but I've never seen a nonzero value below ~0.48. Is there some reason the value would be clamped to 0 below that threshold? Is there anything I'm missing when using TimeSpan.TotalMilliseconds? I'm rather confused by this since it seems so random.
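In case it helps, here is a minimal console repro I put together (separate from the app; the names here are mine) that spins on DateTime.Now and records the smallest nonzero step it observes, which I assumed would reveal the clock's effective granularity:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Poll DateTime.Now in a tight loop and track the smallest
        // nonzero change between consecutive readings.
        double smallestStepMs = double.MaxValue;
        DateTime last = DateTime.Now;

        for (int i = 0; i < 1_000_000; i++)
        {
            DateTime now = DateTime.Now;
            double deltaMs = (now - last).TotalMilliseconds;
            if (deltaMs > 0 && deltaMs < smallestStepMs)
                smallestStepMs = deltaMs;
            last = now;
        }

        Console.WriteLine($"Smallest observed DateTime.Now step: {smallestStepMs} ms");
    }
}
```

On my machine the smallest step printed is roughly in the same ~0.5 ms range I'm seeing for graphicsUpdateRate, which makes me wonder whether the clock itself only advances in discrete steps rather than TotalMilliseconds clamping anything.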