If JavaScript's Number and C#'s double are both specified as IEEE 754, why do they handle numbers with many significant digits differently?
var x = (long)1234123412341234123.0; // 1234123412341234176 - C#
var x = 1234123412341234123.0; // 1234123412341234200 - JavaScript
I am not concerned with the fact that IEEE 754 cannot represent the number 1234123412341234123. I am concerned with the fact that the two implementations do not act the same for numbers that cannot be represented with full precision.
This may be because IEEE 754 is under-specified, because one or both implementations are faulty, or because they implement different variants of IEEE 754.
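One way to narrow this down is to compare the raw bit patterns the two runtimes actually store, rather than the strings they print. Below is a minimal C# sketch (the JavaScript side could do the same check with a DataView over an ArrayBuffer); I'm not asserting the exact hex output here, only that the bit pattern identifies the stored value unambiguously:

using System;

double d = 1234123412341234123.0;               // the literal from the question

// Raw IEEE 754 bit pattern of the double that was actually stored.
long bits = BitConverter.DoubleToInt64Bits(d);
Console.WriteLine(bits.ToString("X16"));

// "R" (round-trip) formatting prints enough digits to reconstruct the exact value.
Console.WriteLine(d.ToString("R"));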
This problem is not related to floating-point output formatting in C#: I'm printing 64-bit integers, not doubles. Consider the following:
long x = 1234123412341234123;
Console.WriteLine(x); // Prints 1234123412341234123
double y = 1234123412341234123; // implicitly rounded to the nearest representable double
x = Convert.ToInt64(y);
Console.WriteLine(x); // Prints 1234123412341234176
The same variable prints different strings because the values are different.
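For what it's worth, here is a small sketch (assuming .NET Core 3.0 or later, where Math.BitIncrement and Math.BitDecrement are available) showing that 1234123412341234176 is the representable double nearest to 1234123412341234123, with its neighbours one ULP (256) away on either side:

using System;

double d = 1234123412341234123.0;                          // rounds to the nearest representable double

// The stored value and its adjacent representable doubles, one ULP apart.
Console.WriteLine(Convert.ToInt64(Math.BitDecrement(d)));  // expected: 1234123412341233920
Console.WriteLine(Convert.ToInt64(d));                     // expected: 1234123412341234176
Console.WriteLine(Convert.ToInt64(Math.BitIncrement(d)));  // expected: 1234123412341234432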