After reading these questions:
Code is behaving differently in Release vs Debug Mode
C# - Inconsistent math operation result on 32-bit and 64-bit
Double precision problems on .NET
Why does this floating-point calculation give different results on different machines?
I suspect that my method for determining FPS, which works in Debug mode but no longer works in Release mode, is failing because I'm using long to hold time values. Here's the relevant code:
public void ActualFPS()
{
    // Once a full second's worth of timestamp ticks has elapsed,
    // publish the call count and start counting the next second.
    if (Stopwatch.GetTimestamp() >= lastTicks + Stopwatch.Frequency)
    {
        actualFPS = runsThisSecond;
        lastTicks = Stopwatch.GetTimestamp();
        runsThisSecond = 0;
    }
}
runsThisSecond is incremented by one every time the method I'm tracing is called (sketched below). Granted, this isn't an overly accurate way to determine FPS, but it works for what I need it to do.
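For context, the call site is shaped roughly like this; Update is a stand-in name for the method I'm actually tracing:

public void Update()
{
    runsThisSecond++; // one call = one "frame"
    ActualFPS();      // rolls the count over once per second
}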
lastTicks is a variable of type long, and Stopwatch.GetTimestamp() also returns a long. Is this my problem? If so, any suggestions for how to work around it?
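For reference, here's the instance-based alternative I'd try next if the raw-timestamp arithmetic turns out to be the problem. It's only a sketch; the FpsCounter class and member names are stand-ins, not my actual code:

using System.Diagnostics;

class FpsCounter
{
    private readonly Stopwatch timer = Stopwatch.StartNew();
    private int runsThisSecond;

    public int ActualFps { get; private set; }

    // Call once per frame from the method being traced.
    public void Tick()
    {
        runsThisSecond++;

        // ElapsedTicks is measured in Stopwatch.Frequency units, so one
        // second has passed once it reaches Frequency.
        if (timer.ElapsedTicks >= Stopwatch.Frequency)
        {
            ActualFps = runsThisSecond;
            runsThisSecond = 0;
            timer.Restart();
        }
    }
}

Note that Stopwatch.Restart() requires .NET 4.0; on earlier frameworks, timer.Reset() followed by timer.Start() does the same thing.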
EDIT: Stopwatch is using the high-resolution timer.
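(Confirmed with the snippet below. When IsHighResolution is false, GetTimestamp() falls back to DateTime ticks and Frequency is just TimeSpan.TicksPerSecond.)

using System;
using System.Diagnostics;

class TimerCheck
{
    static void Main()
    {
        // IsHighResolution reports whether GetTimestamp() is backed by the
        // high-resolution performance counter.
        Console.WriteLine(Stopwatch.IsHighResolution); // True on my machine
        Console.WriteLine(Stopwatch.Frequency);        // timestamp ticks per second
    }
}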
EDIT2: The problem has resolved itself. Without any changes to any of my code. At all. None. I have no idea what caused it to break, or to fix itself. Perhaps my computer decided to spontaneously consider my feelings?