I've been playing with Script#, and I was wondering how C# numbers are converted to JavaScript. I wrote this little bit of code:
int a = 3 / 2;
and looked at the relevant bit of compiled JavaScript:
var $0=3/2;
In C#, the result of 3 / 2 assigned to an int is 1, but in JavaScript, which has only one number type, it's 1.5.
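
To spell out the difference, here's a quick check you could run in any browser console (nothing Script#-specific):

// JavaScript numbers are all IEEE-754 doubles, so division never truncates
console.log(3 / 2); // 1.5
// whereas in C#: int a = 3 / 2; leaves a == 1, since integer division truncates toward zero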
Because of this disparity between the C# and JavaScript behaviour, and since the compiled code doesn't seem to compensate for it, should I assume that my numeric calculations written in C# might behave incorrectly when compiled to JavaScript?
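
For what it's worth, if the compiler really doesn't compensate, I imagine integer division could be emulated by hand. Here's a sketch in plain JavaScript (intDiv is a made-up helper, not anything Script# provides):

// Truncate toward zero, like C# integer division.
// Bitwise OR with 0 coerces the result to a 32-bit integer,
// so this only holds for quotients within the 32-bit range.
function intDiv(a, b) {
    return (a / b) | 0;
}

intDiv(3, 2);  // 1
intDiv(-3, 2); // -1, matching C# (Math.floor(-1.5) would give -2)

But I'd rather know what the compiler actually does than paper over it myself.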