The calculation by itself isn't hard:
    long UnixTime()
    {
        DateTime epochStart = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        return (DateTime.UtcNow - epochStart).Ticks / TimeSpan.TicksPerSecond;
    }
`DateTime` and `TimeSpan` internally store an integral number of ticks, with one tick being 100 ns. I also specified the epoch start as UTC time, because I consider it ugly to subtract `DateTime`s with different `Kind`s, even if it works.
But `DateTime.UtcNow` has very low accuracy. It is only updated every few milliseconds (typical values vary between 1 ms and 16 ms).
To get a constant framerate you could use `Stopwatch`, since you don't need the absolute time. But if you go that way you must use a busy wait, since `Thread.Sleep`, timers, etc. suffer from the same limitation.
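A busy-wait frame limiter built on `Stopwatch` could look roughly like this sketch (the 60 fps target and three-frame loop are my assumptions for illustration):

```csharp
using System;
using System.Diagnostics;

class FrameLimiter
{
    static void Main()
    {
        // Stopwatch.ElapsedTicks counts in Stopwatch.Frequency units,
        // NOT in 100 ns DateTime ticks.
        long ticksPerFrame = Stopwatch.Frequency / 60; // assumed 60 fps target
        var sw = Stopwatch.StartNew();
        long nextFrame = ticksPerFrame;

        for (int frame = 0; frame < 3; frame++) // 3 frames just for the demo
        {
            // ... render the frame here ...

            // Busy-wait until the frame boundary. Thread.Sleep would only be
            // accurate to the system timer resolution (roughly 1-16 ms).
            while (sw.ElapsedTicks < nextFrame) { }
            nextFrame += ticksPerFrame;
        }

        // Three frames at 60 fps take at least ~50 ms.
        Console.WriteLine(sw.ElapsedMilliseconds);
    }
}
```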
Alternatively you can use the `timeBeginPeriod(1)` API to force Windows to update the clock and run timers every 1 ms. But this is a global setting and increases power consumption. Still, it's better than a busy wait.
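Calling `timeBeginPeriod` from C# requires a P/Invoke into `winmm.dll`; a minimal sketch (the class name is mine, and every `timeBeginPeriod` call should be paired with `timeEndPeriod`):

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

static class HighResTimerDemo
{
    // winmm.dll exports timeBeginPeriod/timeEndPeriod (Windows only).
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uMilliseconds);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uMilliseconds);

    static void Main()
    {
        if (!OperatingSystem.IsWindows())
            return; // the multimedia timer API does not exist elsewhere

        timeBeginPeriod(1);   // global: system timer now fires every ~1 ms
        try
        {
            Thread.Sleep(1);  // now sleeps close to 1 ms instead of up to ~16 ms
        }
        finally
        {
            timeEndPeriod(1); // always undo the global setting
        }
    }
}
```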
To measure time differences you can use `Stopwatch`, which is based on `QueryPerformanceCounter`, but this comes with its own set of problems, such as desyncs between different cores. I've seen machines where `QueryPerformanceCounter` jumped by several hundred milliseconds when a thread got scheduled on another core.