What's the relationship between the real CPU frequency and clock_t (whose unit is a clock tick) in C?
Say I have the piece of C code below, which measures the CPU time consumed by running a for loop.
But since CLOCKS_PER_SEC is a constant value (typically 1,000,000) in the C standard library, I wonder how the clock function can measure the real CPU cycles consumed by the program when it runs on computers with different CPU frequencies (for my laptop, 2.6 GHz).
And if the two are not related, how does the CPU timer work in this scenario?
#include <time.h>
#include <stdio.h>
int main(void) {
    clock_t start_time = clock();
    for (int i = 0; i < 10000; i++) {}
    clock_t end_time = clock();
    printf("%fs\n", (double)(end_time - start_time) / CLOCKS_PER_SEC);
    return 0;
}
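
To show what I mean, here is a naive sketch of my mental model: I assumed the elapsed seconds reported by clock() could simply be multiplied by my laptop's nominal 2.6 GHz frequency to get a cycle count. The hard-coded cpu_hz value and that scaling assumption are mine, and that assumption is exactly what I'm unsure about.

#include <stdio.h>
#include <time.h>

int main(void) {
    /* Nominal CPU frequency of my laptop in Hz (2.6 GHz), hard-coded
       purely for illustration. */
    const double cpu_hz = 2.6e9;

    clock_t start_time = clock();
    /* volatile keeps the compiler from optimizing the empty loop away. */
    for (volatile int i = 0; i < 10000; i++) {}
    clock_t end_time = clock();

    /* clock() reports CPU time in ticks; dividing by CLOCKS_PER_SEC gives seconds. */
    double cpu_seconds = (double)(end_time - start_time) / CLOCKS_PER_SEC;

    /* My (possibly wrong) assumption: seconds * frequency = cycles spent. */
    double assumed_cycles = cpu_seconds * cpu_hz;

    printf("%fs CPU time, ~%.0f cycles if my assumption holds\n",
           cpu_seconds, assumed_cycles);
    return 0;
}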