I'm assigning the value of time.h's clock() to two int variables, as follows:
int start_time = clock();
for (int i = 0; i < 1000000; i++) {
    printf("%d\n", i + 1);
}
int end_time = clock();
However, when I print their values and compute the difference, the result doesn't match the wall-clock time that actually elapsed. The POSIX standard declares that CLOCKS_PER_SEC must equal one million, which I take to mean that one clock tick is a microsecond. Is the clock simply not ticking at the rate the standard requires, or is something about my loop throwing off the calculation?
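For reference, this is a minimal, compilable version of what I'm running, including how I convert ticks to seconds (the print format is just for illustration):

#include <stdio.h>
#include <time.h>

int main(void)
{
    int start_time = clock();

    for (int i = 0; i < 1000000; i++) {
        printf("%d\n", i + 1);
    }

    int end_time = clock();

    /* Ticks divided by ticks-per-second should give elapsed seconds. */
    double elapsed = (double)(end_time - start_time) / CLOCKS_PER_SEC;
    printf("elapsed: %f seconds\n", elapsed);

    return 0;
}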
I'm trying to measure the speed of different operations in a similar fashion, and an inaccurate clock ruins my experiments.
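In case it matters, this is roughly the kind of harness I'm aiming for once the timing is reliable (seconds_for and run_me are just placeholder names I'm using here):

#include <stdio.h>
#include <time.h>

/* Placeholder for whatever operation I want to measure. */
static void run_me(void)
{
    volatile long sum = 0;
    for (long i = 0; i < 1000000; i++)
        sum += i;
}

/* Time a single call to op and return the elapsed seconds
   as reported by clock(). */
static double seconds_for(void (*op)(void))
{
    clock_t start = clock();
    op();
    clock_t end = clock();
    return (double)(end - start) / CLOCKS_PER_SEC;
}

int main(void)
{
    printf("run_me took %f seconds\n", seconds_for(run_me));
    return 0;
}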