
The link here says that gettimeofday() sets a structure which contains the number of seconds and microseconds since the Epoch (please tell me what the Epoch is). With that in mind, I set a structure before and after calling the sleep function with parameter 3, so the total time difference between these two structures should be 3 seconds, or 3,000,000 microseconds, but it seems to give some wrong output. Where am I going wrong?

#include <iostream>
#include <ctime>
#include <unistd.h>
#include <cstdio>
#include <sys/time.h>

using namespace std;

int main()
{
    struct timeval start,end;
    gettimeofday(&start,NULL);
    sleep(3);
    gettimeofday(&end,NULL);
    cout<<start.tv_usec<<endl;
    cout<<end.tv_usec<<endl;
    cout<<end.tv_usec-start.tv_usec;
    return 0;
}
  • "it seem to give some wrong output." Q: Would you be kind enough to *share* an example? – paulsm4 Mar 27 '16 at 05:35
  • `711721 711870 149` is one of the outputs, and it keeps changing every time I run it, even the difference of the two. @paulsm4 – 7_R3X Mar 27 '16 at 05:38
  • Suggestion: try [clock_gettime()](https://blog.habets.se/2010/09/gettimeofday-should-never-be-used-to-measure-time?gclid=CIn-h46U4MsCFQmqaQodLRwPqg) instead (for Linux). If Windows, try [QueryPerformanceCounter()](https://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx). – paulsm4 Mar 27 '16 at 05:44
  • Tried using clock_gettime(). The difference of the two I got now is 147767, which again is not equal to 3,000,000,000 nanoseconds, and this too keeps changing on every other execution. Any idea @paulsm4 – 7_R3X Mar 27 '16 at 06:05
  • Q: Are you actually delaying 3 seconds? Q: What happens if you increase it to 10 seconds (which is easier to count)? There are actually lots of reasons "sleep(3)" might not actually delay a full three seconds. The link I cited tells you how to create your own "sleep()", if you wish. – paulsm4 Mar 27 '16 at 06:44
  • @paulsm4 When I changed 3 to 10, `947291850 947441421 149571` was my output, and yes, it did wait for 10 seconds before it displayed the output. – 7_R3X Mar 27 '16 at 07:01

1 Answer


Here's the point you're missing:

unsigned long time_in_micros = 1000000 * tv_sec + tv_usec;

To get the elapsed time in microseconds, you need to ADD "seconds" to "microseconds". You can't just ignore the tv_sec field! Your program only printed tv_usec, which is the fractional-second part of the timestamp, so the difference you saw was never going to be 3,000,000.

Sample code:

#include <unistd.h>
#include <stdio.h>
#include <time.h>
#include <sys/time.h>

int main(int argc, char *argv[])
{
    struct timeval start,end;
    gettimeofday(&start,NULL);
    sleep(3);
    gettimeofday(&end,NULL);
    printf ("start: %ld:%ld\n", start.tv_sec, start.tv_usec);
    printf ("end:   %ld:%ld\n", end.tv_sec, end.tv_usec);
    printf ("diff:  %ld:%ld\n",
      end.tv_sec-start.tv_sec, end.tv_usec-start.tv_usec);

    gettimeofday(&start,NULL);
    sleep(10);
    gettimeofday(&end,NULL);
    printf ("start: %ld:%ld\n", start.tv_sec, start.tv_usec);
    printf ("end:   %ld:%ld\n", end.tv_sec, end.tv_usec);
    printf ("diff:  %ld:%ld\n", 
      end.tv_sec-start.tv_sec, end.tv_usec-start.tv_usec);
    return 0;
}

Corresponding output:

start: 1459100430:214715
end:   1459100433:215357
diff:  3:642
start: 1459100433:215394
end:   1459100443:217024
diff:  10:1630

