The timeGetTime function documentation says:
The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime.
So the precision is system-dependent. But what if I don't want to increase the precision and just want to know what it is on the current system? Is there a standard way (e.g. an API) to query it? Or should I just poll timeGetTime for a while, look at the values it returns, and deduce the precision from there?
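To be concrete about the polling idea, something like this untested sketch is what I have in mind (the sample count and the busy-wait loop are arbitrary choices on my part, not anything the documentation suggests):

```c
#include <windows.h>
#include <mmsystem.h>   /* timeGetTime */
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

int main(void)
{
    /* Spin on timeGetTime and record the smallest nonzero step between
       successive readings; that step should roughly reflect the timer's
       current resolution in milliseconds. */
    DWORD prev = timeGetTime();
    DWORD smallest = 0xFFFFFFFF;
    int steps = 0;

    while (steps < 100) {
        DWORD now = timeGetTime();
        if (now != prev) {
            DWORD step = now - prev;
            if (step < smallest)
                smallest = step;
            prev = now;
            ++steps;
        }
    }

    printf("Smallest observed timeGetTime step: %lu ms\n",
           (unsigned long)smallest);
    return 0;
}
```

That feels like a workaround, though, so I'd prefer an API that reports the current precision directly if one exists.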