I know why 0.1 + 0.2 !== 0.3: 0.1 and 0.2 cannot be represented exactly in binary floating point. But why does 0.1 + 0.3 === 0.4 hold in JavaScript? Neither 0.1 nor 0.3 can be represented exactly, so why should their sum compare exactly equal to 0.4? When I test this in C I get 0.40000000000000002, which is the result I expected. Here is my C code:
#include <stdio.h>

int main(void) {
    double a = 0.1 + 0.3;    /* both addends are rounded, and so is the sum */
    printf("%.17f\n", a);    /* prints 0.40000000000000002 */
    return 0;
}
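
For comparison, here is a minimal sketch of the same experiment (assuming IEEE 754 doubles, which is what JavaScript numbers are and what typical C compilers use) that prints each value to 17 decimal digits alongside the comparison results:

#include <stdio.h>

int main(void) {
    /* Each literal is rounded to the nearest double, and each sum is
       rounded again; the comparisons depend on where those roundings land. */
    printf("0.1 + 0.2 = %.17f\n", 0.1 + 0.2);               /* 0.30000000000000004 */
    printf("0.3       = %.17f\n", 0.3);                     /* 0.29999999999999999 */
    printf("0.1 + 0.3 = %.17f\n", 0.1 + 0.3);               /* 0.40000000000000002 */
    printf("0.4       = %.17f\n", 0.4);                     /* 0.40000000000000002 */
    printf("0.1 + 0.2 == 0.3 -> %d\n", 0.1 + 0.2 == 0.3);  /* 0 (false) */
    printf("0.1 + 0.3 == 0.4 -> %d\n", 0.1 + 0.3 == 0.4);  /* 1 (true)  */
    return 0;
}

On an IEEE 754 machine the output shows that C agrees with JavaScript here: the sum 0.1 + 0.3 and the literal 0.4 round to the very same double (whose decimal expansion begins 0.40000000000000002), while 0.1 + 0.2 and 0.3 round to different doubles.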
Thank you very much!