5

Is there any way of obtaining high-precision system time in Python?

I have a small application that works with a virtual COM port. I want to measure the time interval between sending a message and receiving it.

At the moment it works like this:

I obtain the message, use

    time.time()

and append its 20-character decimal representation to the message. The client application receives this message and calls

    time.time()

again, then calculates the difference. In most cases the time interval (as I expected) equals zero.

The question is: is there a more intelligent way of doing this, with more precision?
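
A minimal sketch of the current approach (`serial_out`/`serial_in` and the `|` separator are illustrative stand-ins for the existing COM-port code, not part of it):

    import time

    # Sender side: stamp the message with the current wall-clock time.
    def send_with_timestamp(serial_out, payload):
        stamp = "%.9f" % time.time()            # decimal timestamp string
        serial_out.write(stamp + "|" + payload)

    # Receiver side: parse the stamp and compare it with the local clock.
    def receive_and_measure(serial_in):
        message = serial_in.readline()
        stamp, payload = message.split("|", 1)
        delay = time.time() - float(stamp)      # usually 0.0 due to time.time() granularity
        return payload, delay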

mr_borsch
  • If the sender and receiver are on different machines, how do you handle the different timebases? In any case, the serial port driver will send you the message whenever it feels like it, which is wildly variable (10's of milliseconds, in my experience). – mtrw Dec 05 '11 at 17:35
  • The sender and the receiver are on the same machine (I am using virtual COM ports). The program is written for educational purposes, so I am asking if there is an intelligent way to solve this problem. – mr_borsch Dec 05 '11 at 17:39
  • related: http://bugs.python.org/issue10278 [time.walltime() in C](http://bugs.python.org/review/10278/patch/3730/11849), [in Python call clock_gettime() using ctypes](http://stackoverflow.com/a/1205762/4279) – jfs Dec 05 '11 at 18:01
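
A rough sketch of the ctypes approach from the last link (Linux-only; the `librt.so.1` name and the `CLOCK_MONOTONIC` constant value are platform assumptions, check your system headers):

    import ctypes
    import os

    CLOCK_MONOTONIC = 1  # value from <linux/time.h>; platform-specific assumption

    class timespec(ctypes.Structure):
        _fields_ = [("tv_sec", ctypes.c_long), ("tv_nsec", ctypes.c_long)]

    _librt = ctypes.CDLL("librt.so.1", use_errno=True)
    _clock_gettime = _librt.clock_gettime
    _clock_gettime.argtypes = [ctypes.c_int, ctypes.POINTER(timespec)]

    def monotonic_time():
        """Return CLOCK_MONOTONIC as float seconds with nanosecond resolution."""
        t = timespec()
        if _clock_gettime(CLOCK_MONOTONIC, ctypes.pointer(t)) != 0:
            errno = ctypes.get_errno()
            raise OSError(errno, os.strerror(errno))
        return t.tv_sec + t.tv_nsec * 1e-9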

1 Answer

2

Here is an excerpt from the documentation for `time.clock`:

On Unix, return the current processor time as a floating point number expressed in seconds. The precision, and in fact the very definition of the meaning of “processor time”, depends on that of the C function of the same name, but in any case, this is the function to use for benchmarking Python or timing algorithms.

On Windows, this function returns wall-clock seconds elapsed since the first call to this function, as a floating point number, based on the Win32 function QueryPerformanceCounter(). The resolution is typically better than one microsecond.

(emphasis mine)
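
As a quick illustration of that difference in granularity (Python 2-era API, matching the excerpt above; on later Pythons `time.perf_counter()` plays this role):

    import time

    def smallest_tick(timer):
        # Spin until the timer reports a new value; the step size is its
        # effective resolution on this platform.
        t0 = timer()
        while True:
            t1 = timer()
            if t1 != t0:
                return t1 - t0

    # On Windows, time.clock() wraps QueryPerformanceCounter, so its tick is
    # far smaller than the ~15 ms steps time.time() typically shows there.
    print("time.time()  tick: %g s" % smallest_tick(time.time))
    print("time.clock() tick: %g s" % smallest_tick(time.clock))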

Brigand
  • Thanks a lot for your answer, but the client and the sender applications run as different processes. So how can I use this function to calculate the time interval? – mr_borsch Dec 05 '11 at 17:28
  • I really couldn't say. The best chance would be having the `send` block until the message is received, so you can catch the time before and after. One way to do this would be expecting a reply, but that would influence your results. The other option is sending a very long message that would take minutes to transmit, so that `time.time()` would have enough precision. I'm really hoping someone finds a better solution... – Brigand Dec 05 '11 at 17:41
  • Calibrate the QueryPerformanceCounter against time.time at the start of your code, so the timestamp you send will be a high precision version of time.time(). Worth a shot! – Nick Craig-Wood Dec 05 '11 at 17:52
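
A rough sketch of the calibration idea from the last comment (hypothetical helper; on Windows, `time.clock()` is the QueryPerformanceCounter-backed timer the answer describes):

    import time

    # Record one (wall clock, performance counter) pair at startup, then derive
    # high-resolution wall-clock timestamps from performance-counter offsets.
    # Each process calibrates independently, so the offset between the two
    # calibrations still limits the accuracy of the measured interval.
    _wall_at_start = time.time()
    _perf_at_start = time.clock()

    def precise_time():
        """time.time()-like value with time.clock() resolution."""
        return _wall_at_start + (time.clock() - _perf_at_start)

    # The sender would embed precise_time() in the message; the receiver compares
    # it against its own precise_time() to estimate the transfer delay.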