I have the following piece of code:
#include <iostream>
using namespace std;

int main() {
    double value = 0;
    double gain = 0.01;
    double offset = -500;
    value -= offset;
    unsigned long raw = value / gain;
    cout << raw;
    return 0;
}
On my Windows machine with MinGW 5.0 and gcc 4.7.3, the console output of this program is 49999 instead of 50000. In a random online IDE (https://ideone.com/uDhPFM) as well as on my Linux machine, the result is 50000 as expected.
Why is that so?
EDIT: On both Windows and Linux, I am using the default installation of CLion to run the program.
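To make the difference visible, one thing I could do is dump the exact bit pattern of the quotient with printf's %a conversion (a minimal sketch, not the program above; the variable names are made up for illustration):

#include <cstdio>

int main() {
    double value  = 0;
    double gain   = 0.01;
    double offset = -500;
    value -= offset;                      // value is now exactly 500.0

    double quotient = value / gain;       // result stored into a 64-bit double
    unsigned long direct = value / gain;  // result truncated straight to an integer

    // %a prints the exact bits of a double in hexadecimal floating-point notation,
    // so any deviation from 0x1.86ap+15 (which is exactly 50000) becomes visible.
    std::printf("quotient = %a\n", quotient);
    std::printf("direct   = %lu\n", direct);
    return 0;
}

If the quotient stored in the double comes out as exactly 0x1.86ap+15 while the directly truncated value is 49999, the difference would be confined to the intermediate result of the division before it is rounded to a 64-bit double.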
UPDATE:
#include <iostream>
#include <iomanip>   // needed for setiosflags and setprecision
using namespace std;

int main() {
    double value = 0;
    double gain = 0.01;
    double offset = -500;
    value = value - offset;
    double rawDouble = value / gain;
    unsigned long rawInt = value / gain;
    cout << rawDouble << endl;
    cout << rawInt << endl;
    cout << setiosflags(ios::fixed) << setprecision(24) << rawDouble << endl;
    cout << setiosflags(ios::fixed) << setprecision(24) << (value / gain) << endl;
    return 0;
}
Using this code, the output is:
50000
49999
50000.000000000000000000000000
50000.000000000000000000000000
Only the division whose result goes directly into the integer assignment (i.e. the implicit cast) seems to fail.
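For comparison, a quotient that is only marginally below 50000 already reproduces both symptoms: it prints as 50000 at the default stream precision but truncates to 49999 (a standalone sketch, not my original code):

#include <cmath>
#include <iostream>

int main() {
    // The largest double strictly below 50000.0, standing in for a quotient
    // that falls fractionally short of 50000.
    double justBelow = std::nextafter(50000.0, 0.0);

    std::cout << justBelow << std::endl;                              // prints 50000 (default precision, 6 digits)
    std::cout << static_cast<unsigned long>(justBelow) << std::endl;  // prints 49999: the conversion truncates toward zero
    return 0;
}

So the question is why the directly truncated quotient behaves as if it were below 50000 on the MinGW build only.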