Would there be any reason not to use the difference between two SystemClock.elapsedRealtimeNanos() readings taken far apart to measure the amount of time that passed in between? That is, is SystemClock.elapsedRealtimeNanos() as accurate over long intervals as System.currentTimeMillis()?
Or does the extra precision come at the cost of accuracy? I've noticed scattered reports of drift, but no clear answers, and the Android documentation doesn't indicate this should be the case:
elapsedRealtime() and elapsedRealtimeNanos() return the time since the system was booted, and include deep sleep. This clock is guaranteed to be monotonic, and continues to tick even when the CPU is in power saving modes, so is the recommended basis for general purpose interval timing.
I'm currently running a test while debugging/plugged in. Tomorrow I can check whether any noticeable drift occurred, but even that won't be conclusive without more extensive testing, so I was hoping others could confirm what to expect from these timers, including across phones and manufacturers.
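For reference, the kind of drift test I mean can be sketched like this: record both clocks at the start, wait, then compare how much each says has elapsed. This is only a sketch; `SystemClock.elapsedRealtimeNanos()` exists only on Android, so the plain-JVM `System.nanoTime()` stands in for it here (on a device you would substitute the SystemClock call, and let it run for hours rather than seconds).

```java
public class ClockDriftCheck {

    /**
     * Returns (monotonic elapsed) - (wall-clock elapsed), in milliseconds,
     * measured over a sleep of sleepMs. On Android, System.nanoTime() would
     * be replaced by SystemClock.elapsedRealtimeNanos().
     */
    static long measureDriftMs(long sleepMs) throws InterruptedException {
        long wallStartMs = System.currentTimeMillis();
        long monoStartNs = System.nanoTime();

        Thread.sleep(sleepMs); // in a real test, wait hours or days instead

        long wallElapsedMs = System.currentTimeMillis() - wallStartMs;
        long monoElapsedMs = (System.nanoTime() - monoStartNs) / 1_000_000;

        // Note: wall time can itself jump because of NTP corrections, so a
        // nonzero value here does not by itself prove the monotonic clock drifted.
        return monoElapsedMs - wallElapsedMs;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("drift over 2 s: " + measureDriftMs(2_000) + " ms");
    }
}
```

Over a short interval the two clocks should agree closely; the interesting question is whether the gap grows over long, unplugged, deep-sleep-heavy runs.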