I'm trying to measure the one-way latency of a UDP multicast message. One device sends a message on the multicast group every half second, and another device listens. My idea was to take a timestamp on the send side with time.time(), send it as the message payload, and subtract it from the timestamp taken on the receiving end when the message arrives.
However, I'm getting some negative numbers. I suspect part of this is decimal accuracy, which I'm still figuring out, but it raises the question:
How globally accurate is time.time()? Will time.time() be the same down to the millisecond or microsecond for the same instant on two different devices?
Some code:
Sender:
import time
import decimal
[... socket setup ...]
# Take the send-side timestamp and serialize it as text
current = str(decimal.Decimal(time.time()))
# sendto() needs bytes in Python 3, so encode the string
sock.sendto(current.encode(), ...)
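Side note on the decimal part: wrapping the float in Decimal just expands the float's exact binary value, so it adds digits but no real precision; building the Decimal from str(time.time()) would match the printed value instead. A quick sketch of the difference, runnable on its own:
import decimal
import time
t = time.time()
print(repr(t))                  # shortest string that round-trips the float
print(decimal.Decimal(t))       # exact binary expansion, lots of trailing digits
print(decimal.Decimal(str(t)))  # Decimal built from the printed value instead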
Receiver:
import time
[... socket setup ...]
data = sock.recv(10240)
current = time.time()
# recv() returns bytes, so parse the timestamp back into a float before subtracting
stamp = float(data)
diff = current - stamp
# diff should be positive...
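In case the elided socket setup matters, here is a self-contained version of roughly what I'm doing (the multicast group 224.1.1.1 and port 5007 are just placeholder values, and the receiver includes the float conversion):
Full sender:
import socket
import time
MULTICAST_GROUP = "224.1.1.1"  # placeholder group address
PORT = 5007                    # placeholder port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
# Keep the multicast traffic on the local network segment
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
while True:
    # Send the current wall-clock time as the payload, twice a second
    sock.sendto(str(time.time()).encode(), (MULTICAST_GROUP, PORT))
    time.sleep(0.5)
Full receiver:
import socket
import struct
import time
MULTICAST_GROUP = "224.1.1.1"  # placeholder group address
PORT = 5007                    # placeholder port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))
# Join the multicast group on all interfaces
mreq = struct.pack("4sl", socket.inet_aton(MULTICAST_GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
while True:
    data = sock.recv(10240)
    current = time.time()
    diff = current - float(data)  # this is what sometimes comes out negative
    print(diff)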
Edit: this is not a duplicate. That high-precision question is about relative precision; I'm asking about accuracy between devices. A device may be precise to the microsecond relative to itself but still be a whole second off from another machine.
Edit 2: It seems that my actual question is, how globally accurate are Unix clocks?
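One way I can at least sanity-check each machine is to ask an NTP server how far off its local clock is. Assuming the third-party ntplib package is available (just an assumption for illustration, not part of my setup):
import ntplib
# Query a public NTP pool server and report the estimated offset
# between the local clock and the server, in seconds.
client = ntplib.NTPClient()
response = client.request("pool.ntp.org", version=3)
print("estimated local clock offset: %.6f s" % response.offset)
Running that on both devices and comparing the two offsets should show whether the negative diffs are simply the clocks disagreeing by more than the network latency.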