I wrote the following C++ program to measure how long it takes to print to the standard output stream in different ways:
#include <chrono>
#include <iostream>

using namespace std;
using namespace std::chrono;

int main() {
    // Get starting timepoint
    auto start = high_resolution_clock::now();

    for (int i = 0; i < 100000; i++) {
        cout << "Hello";
    }

    // Get ending timepoint
    auto stop = high_resolution_clock::now();

    // Subtract the timepoints to get the duration, then use
    // duration_cast to convert it to the proper unit
    auto duration = duration_cast<microseconds>(stop - start);

    cout << "Time taken by function: "
         << duration.count() << " microseconds" << endl;
    return 0;
}
In the first round I ran it with: cout << "Hello" << endl;
and it took 147,570 microseconds.
In the second round I ran it with: cout << "Hello\n";
and it took 128,543 microseconds.
Lastly, I ran it with: printf("Hello\n");
and it took 121,223 microseconds.
What caused this noticeable difference?
Note: each figure is the average of 10 runs of that variant.
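For reference, here is a minimal sketch of how the three rounds could be combined into a single program, assuming the same 100,000-iteration loop as above; the time_print helper name is my own, and the timing results are written to stderr so they don't mix with the benchmark output on stdout:

#include <chrono>
#include <cstdio>
#include <iostream>

using namespace std::chrono;

// Time `iterations` calls of `f` and return the elapsed microseconds.
template <typename F>
long long time_print(F f, int iterations = 100000) {
    auto start = high_resolution_clock::now();
    for (int i = 0; i < iterations; i++) {
        f();
    }
    auto stop = high_resolution_clock::now();
    return duration_cast<microseconds>(stop - start).count();
}

int main() {
    // Round 1: operator<< with std::endl (inserts '\n' and flushes)
    auto with_endl = time_print([] { std::cout << "Hello" << std::endl; });
    // Round 2: operator<< with a plain newline (no explicit flush)
    auto with_newline = time_print([] { std::cout << "Hello\n"; });
    // Round 3: printf with a newline
    auto with_printf = time_print([] { std::printf("Hello\n"); });

    std::fprintf(stderr, "endl:   %lld us\n", with_endl);
    std::fprintf(stderr, "\\n:     %lld us\n", with_newline);
    std::fprintf(stderr, "printf: %lld us\n", with_printf);
    return 0;
}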