
My code snippet is below.

int nStdWorkDays = 21;
double dbStdWorkDaysMin = nStdWorkDays * 0.9;  // expected: 18.9

When I debugged and added a watch in Visual Studio, I found that the value of dbStdWorkDaysMin is 18.900000000000002 rather than 18.9.

Because of this error, the comparison '18.9 < dbStdWorkDaysMin' evaluates to true!
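
In case it helps, here is a minimal complete program that reproduces the behavior (the iostream/iomanip output is only there to show the full stored value; the watch window in Visual Studio shows the same thing):

#include <iostream>
#include <iomanip>

int main() {
    int nStdWorkDays = 21;
    double dbStdWorkDaysMin = nStdWorkDays * 0.9;

    // Print enough digits to expose the representation error.
    std::cout << std::setprecision(17) << dbStdWorkDaysMin << "\n";   // prints 18.900000000000002

    // The surprising comparison:
    std::cout << std::boolalpha << (18.9 < dbStdWorkDaysMin) << "\n"; // prints true

    return 0;
}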

I wonder why this happens. What similar traps should I watch out for, and how can I get the correct calculation result?
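
One workaround I have considered (just a sketch, assuming a small fixed tolerance is acceptable for my use case; nearlyEqual is a helper name I made up) is to compare with an epsilon instead of comparing directly:

#include <cmath>

// Hypothetical helper: treat two doubles as equal when they differ
// by less than a small tolerance chosen for this application.
bool nearlyEqual(double a, double b, double eps = 1e-9) {
    return std::fabs(a - b) < eps;
}

// Usage: instead of (18.9 < dbStdWorkDaysMin), first ask whether the
// two values differ meaningfully at all:
// if (!nearlyEqual(18.9, dbStdWorkDaysMin) && 18.9 < dbStdWorkDaysMin) { ... }

I am not sure whether a fixed epsilon is the right approach here, or whether I should avoid floating point entirely (e.g. work in integer tenths of a day), so advice on that would be welcome too.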

Thank you all in advance.

Johnson