I'm studying computer hardware, where we learn that a hardware timer gives more accurate results than a software delay. I've written a 1 millisecond software delay in assembly. With it I can start a process that repeats every millisecond and, using a counter, do something else every 100th millisecond. This technique is supposedly less accurate than using the hardware timer that comes built into the hardware I'm going to use next.
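To make the idea concrete, here is roughly the same tick-and-counter pattern sketched in Java, with Thread.sleep standing in for my assembly delay loop and doSomething as a placeholder for the periodic work:

```java
public class TickCounter {
    public static void main(String[] args) throws InterruptedException {
        int counter = 0;
        while (true) {
            Thread.sleep(1);      // stand-in for the 1 ms software delay
            counter++;
            if (counter == 100) { // every 100th tick, i.e. roughly every 100 ms
                counter = 0;
                doSomething();
            }
        }
    }

    // placeholder for whatever should happen every 100th millisecond
    private static void doSomething() {
        System.out.println("tick at " + System.currentTimeMillis());
    }
}
```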
So I wonder: how accurate is the timing built into Java? We have System.currentTimeMillis and Thread.sleep, and these might not use hardware timers. How accurate are these built-in Java methods compared to a hardware timer?
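For example, I could try to measure the error myself with something like the following test sketch of mine, using System.nanoTime as the reference clock, but I don't know whether that tells me anything about the underlying timer hardware:

```java
public class SleepAccuracy {
    public static void main(String[] args) throws InterruptedException {
        final int iterations = 1000;
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            Thread.sleep(1); // request a 1 ms delay each iteration
        }
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        // Ideally this prints 1000 ms; the difference is the accumulated error.
        System.out.println("requested: " + iterations + " ms, actual: " + elapsedMs + " ms");
    }
}
```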