
I was just playing with some of the timer functions to estimate the real time and CPU time consumed by a piece of code, using `clock()` and `gtod()`. In the other case, I simply ran `time ./objectfile`. However, I was surprised to see a big difference between the two techniques (51 ms in one case, 82 ms in the other). So, when you estimate CPU time and real time, which method is more likely to be the accurate one?

Thanks.

beginner
  • It isn't entirely clear from your question, but one difference may be the start-up overhead of running the program (which would be counted by `time`). If the function calls are inside the program, then they skip the start-up overhead because they don't execute until the program is running. Also, at a few milliseconds total time, scheduling and other processes running on the system can easily interfere with the timing. You'd need to run the process multiple times to get consistent timing. – Jonathan Leffler Jul 10 '15 at 03:41
  • Thanks Jonathan. My question was basically this: I was trying to estimate CPU time and real time using two methods, one using `gtod()`, `clock()`, and `readTSC()` for an Intel processor, and the other simply running the `time` command. However, I got two different results: the first method gave me 51 ms CPU time, the second gave me 82 ms. So, I was wondering which of these methods was correct? – beginner Jul 10 '15 at 05:53
  • Is the code using `gtod()`, `clock()` and `readTSC()` also running the program, or is that timing code embedded in the program you are measuring? The difference is critical, as I outlined in my previous comment. Either or neither is correct; it depends on what you want to measure. I'm not familiar with `gtod()`, so I don't know what it does at all. – Jonathan Leffler Jul 10 '15 at 05:55
  • The timing code using those functions is embedded in the program I am measuring. Well, `gtod()` is defined as: `volatile double gtod(void) { static struct timeval tv; static struct timezone tz; gettimeofday(&tv, &tz); return tv.tv_sec + 1.e-6*tv.tv_usec; }`. If I want to estimate the time of a code section, I simply write `t = gtod();`, then the code section, then `t_real = gtod() - t;`. So that is what `gtod()` does: it just returns a timestamp. – beginner Jul 10 '15 at 06:00
  • The three functions, `gtod()`, `clock()` and `readTSC()` are measuring three different things. They're all run from within the program, so by definition they cannot measure the time it takes to start the program. The `time` command does run the program and takes into account the start-up (and shut-down) time. You need to decide what you are trying to measure. No-one can tell you what is accurate if you don't say what you want to measure. All four methods are accurate, even when they give different answers. They're subject to different errors. They have different resolutions. – Jonathan Leffler Jul 10 '15 at 06:04
  • Incidentally, asking about `gtod()` when it is a function you wrote, and not explaining what it is or showing its code in the question, is really almost rude. How are we supposed to guess what it is and what it does? Your question is the only useful thing that a Google search for 'gtod time' pulls up — everything else seems to be from 'god time'. – Jonathan Leffler Jul 10 '15 at 06:13
  • Thanks for your reply. Now I get the issue. Oh, I am sorry about that. I didn't pay attention to describing the functions. – beginner Jul 10 '15 at 06:40

0 Answers