I'm trying to do some performance measurements in C++ by comparing the real (wall-clock) elapsed time in milliseconds with the CPU time in milliseconds. This is what my code looks like:
#include <chrono>   // std::chrono::high_resolution_clock (wall-clock time)
#include <ctime>    // std::clock, CLOCKS_PER_SEC (CPU time)

auto start = std::chrono::high_resolution_clock::now();
std::clock_t begin = std::clock();

// some computationally expensive task

auto finish = std::chrono::high_resolution_clock::now();
std::clock_t end = std::clock();

// wall-clock time in milliseconds
auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(finish - start).count();
// CPU time in milliseconds
auto cpu_duration = 1000 * (end - begin) / CLOCKS_PER_SEC;
Now I would expect the CPU time to be lower than the real (wall-clock) time, because the thread might be interrupted or preempted and that time should not be counted as CPU time. However, the CPU time is 2-3 times higher than the real time. Am I doing something wrong, or am I misunderstanding the concept of CPU time?
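For reference, here is a minimal, self-contained sketch of the timing code; the sqrt loop and its iteration count are only placeholders I picked to stand in for the actual expensive task:

#include <chrono>
#include <cmath>
#include <ctime>
#include <iostream>

int main() {
    auto start = std::chrono::high_resolution_clock::now();
    std::clock_t begin = std::clock();

    // placeholder for the computationally expensive task
    volatile double sink = 0.0;
    for (long i = 0; i < 50000000; ++i)
        sink = sink + std::sqrt(static_cast<double>(i));

    auto finish = std::chrono::high_resolution_clock::now();
    std::clock_t end = std::clock();

    // wall-clock time vs. CPU time, both in milliseconds
    auto wall_ms = std::chrono::duration_cast<std::chrono::milliseconds>(finish - start).count();
    auto cpu_ms  = 1000 * (end - begin) / CLOCKS_PER_SEC;

    std::cout << "wall time: " << wall_ms << " ms\n"
              << "cpu time:  " << cpu_ms  << " ms\n";
}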