I compiled this code with MinGW32 (GCC 4.4.1) on Windows 7 64-bit using Code::Blocks. On Debian I used g++ 4.9.2.
#include <iostream>
#include <ctime>    // clock(), CLOCKS_PER_SEC
#include <cstdlib>  // system()

using namespace std;

int main()
{
    clock_t t1, t2;

    t1 = clock();
    for (int i = 0; i <= 50000; i++)
    {
        cout << i << " ";
    }
    t2 = clock();

    float diff = (float)(t2 - t1) / CLOCKS_PER_SEC;
    cout << "it took: " << diff << endl;

    system("pause");
    return 0;
}
Windows XP 32-bit Virtual Machine: 3 times, mean time to generate was 7.656 sec
Windows 10 32-bit Virtual Machine: 3 times, mean time to generate was 16.446 sec
Debian 8.2 32-bit Virtual Machine: 3 times, mean time to generate was 0.0118 sec
How can this huge difference in time be explained, especially between Linux and Windows?
Besides your explanation, please suggest keywords and topics I can research to gain a deeper understanding of the cause.