I currently get a higher-resolution timestamp in C# by capturing a start time with DateTime.UtcNow and then using a Stopwatch to add elapsed ticks to it as time goes by. I came across Stopwatch.GetTimestamp() as a potential alternative, or even a better solution, but I cannot find reliable information on exactly what this function returns.
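Roughly, my current approach looks like this (a simplified sketch; the class and member names are just mine for illustration):

    using System;
    using System.Diagnostics;

    // Simplified sketch of what I'm doing now: capture DateTime.UtcNow once,
    // then add the Stopwatch's elapsed time to it to get a finer-grained "now".
    public static class HiResClock
    {
        private static readonly DateTime Start = DateTime.UtcNow;
        private static readonly Stopwatch Timer = Stopwatch.StartNew();

        public static DateTime UtcNow => Start + Timer.Elapsed;
    }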
The best source of info seems to be this.
GetTimestamp() returns machine-dependent ticks, which can be converted to seconds by dividing by Stopwatch.Frequency. When I do this, I get a value that appears to be a UTC UNIX timestamp, which is exactly what I'm after - but I haven't found anything that states this is what I should expect from it.
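Concretely, this is the conversion I'm doing and what I'm comparing it against (just a sketch of my experiment, not anything documented):

    using System;
    using System.Diagnostics;

    class Program
    {
        static void Main()
        {
            // Machine-dependent ticks from whatever timer backs Stopwatch.
            long timestamp = Stopwatch.GetTimestamp();

            // Frequency is documented as the number of those ticks per second.
            double seconds = (double)timestamp / Stopwatch.Frequency;

            // For comparison: seconds since the UNIX epoch, derived from DateTime.
            DateTime unixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
            double unixSeconds = (DateTime.UtcNow - unixEpoch).TotalSeconds;

            Console.WriteLine($"GetTimestamp / Frequency: {seconds}");
            Console.WriteLine($"Seconds since UNIX epoch: {unixSeconds}");
        }
    }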
One clue from MSDN states:
If the Stopwatch class uses a high-resolution performance counter, GetTimestamp returns the current value of that counter. If the Stopwatch class uses the system timer, GetTimestamp returns the current DateTime.Ticks property of the DateTime.Now instance.
Looking then at DateTime.Ticks, we see:
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001 (0:00:00 UTC on January 1, 0001, in the Gregorian calendar), which represents DateTime.MinValue.
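As an aside, the gap between that year-0001 epoch and the UNIX epoch is a fixed number of 100-nanosecond ticks, which is easy to compute:

    using System;

    class Program
    {
        static void Main()
        {
            // Ticks (100 ns units) from 0001-01-01 to 1970-01-01, i.e. the offset
            // between the DateTime epoch and the UNIX epoch.
            long unixEpochTicks = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc).Ticks;
            Console.WriteLine(unixEpochTicks); // 621355968000000000
        }
    }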
I'm therefore not clear on how simply dividing a machine-dependent tick count by the frequency can give me a UNIX 1970-based timestamp. Is it possible that, if a high-resolution performance counter is not available on the target platform, I might get a year-0001-based timestamp instead? Or maybe something else entirely, again depending on the available high-resolution timer?
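If it helps, the check I'd expect to decide between the two cases in the MSDN note above is this (I'm assuming IsHighResolution is the deciding flag):

    using System;
    using System.Diagnostics;

    class Program
    {
        static void Main()
        {
            // Per the MSDN note quoted above, which case applies should depend on this flag.
            if (Stopwatch.IsHighResolution)
            {
                Console.WriteLine("High-resolution performance counter: GetTimestamp " +
                                  "returns that counter's raw value.");
            }
            else
            {
                Console.WriteLine("System timer fallback: GetTimestamp returns " +
                                  "DateTime.Ticks, i.e. a year-0001-based value.");
            }

            Console.WriteLine($"Frequency: {Stopwatch.Frequency} ticks per second");
        }
    }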