Let me ask my question by means of this test program:
#include <iostream>
#include <chrono>

using std::chrono::nanoseconds;
using std::chrono::duration_cast;

int main(int argc, char* argv[])
{
    // Print the clock's advertised tick period, converted to nanoseconds.
    std::cout
        << "Resolution (nano) = "
        << (double) std::chrono::high_resolution_clock::period::num /
           std::chrono::high_resolution_clock::period::den *
           1000 * 1000 * 1000
        << std::endl;

    // Time a single std::cout statement.
    auto t1 = std::chrono::high_resolution_clock::now();
    std::cout << "How many nanoseconds does std::cout take?" << std::endl;
    auto t2 = std::chrono::high_resolution_clock::now();

    auto diff = t2 - t1;
    nanoseconds ns = duration_cast<nanoseconds>(diff);
    std::cout << "std::cout takes " << ns.count() << " nanoseconds"
              << std::endl;
    return 0;
}
Output on my machine:
Resolution (nano) = 100
How many nanoseconds does std::cout take?
std::cout takes 1000200 nanoseconds
The result is always one of 1000200, 1000300, 1000400, 1000500, 1000600, or 2000600 nanoseconds, i.e. 1 or 2 milliseconds. So either the resolution of std::chrono is not 100 nanoseconds, or the way I measure the time of std::cout is wrong. (Why do I never get something between 1 and 2 milliseconds, for example 1500000 nanoseconds?)
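To check what the clock really delivers (independent of what period::num / period::den advertise), I could poll the clock in a tight loop and print the smallest increment it ever reports. This is just a diagnostic sketch, not part of the program above:

#include <iostream>
#include <chrono>

int main()
{
    using clock = std::chrono::high_resolution_clock;

    // Busy-wait until the reported time changes, then print the step.
    // This shows the effective tick of the clock, regardless of what
    // period::num / period::den claim.
    auto start = clock::now();
    auto next  = start;
    while (next == start)
        next = clock::now();

    auto tick = std::chrono::duration_cast<std::chrono::nanoseconds>(next - start);
    std::cout << "Smallest observed increment: " << tick.count() << " ns" << std::endl;
    return 0;
}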
I need a high-resolution timer in C++. The OS itself clearly provides one, because I can measure things with microsecond precision using the C# Stopwatch class on the same machine. So I just need to use the OS's high-resolution timer correctly from C++.
How do I fix my program to produce the expected results?
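(For what it's worth: as far as I know, the C# Stopwatch is built on the Windows QueryPerformanceCounter API, so I assume the raw OS timer could be reached from C++ roughly like in the sketch below. This is only an illustration under the assumption that I am on Windows, not necessarily the right fix.)

#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);   // ticks per second of the OS timer

    QueryPerformanceCounter(&start);
    std::cout << "How many nanoseconds does std::cout take?" << std::endl;
    QueryPerformanceCounter(&stop);

    // Convert elapsed ticks to nanoseconds.
    double ns = (stop.QuadPart - start.QuadPart) * 1e9 / freq.QuadPart;
    std::cout << "std::cout takes " << ns << " nanoseconds" << std::endl;
    return 0;
}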