Compiling this code using Emscripten:
#include <stdio.h>

int main() {
    unsigned long d1 = 0x847c9b5d;
    unsigned long q = 0x549530e1;
    printf("%lu\n", d1*q);
    return 0;
}
yields (using -g):
$d1=-2072208547; //@line 3 "minusmul.c"
$q=1419063521; //@line 4 "minusmul.c"
var $2=$d1; //@line 5 "minusmul.c"
var $3=$q; //@line 5 "minusmul.c"
var $4=((($2)*($3))|0); //@line 5 "minusmul.c"
Executing this using js (SpiderMonkey, I believe?) or node, I get the result 3217488896. Executing the native executable (compiled using GCC), I get 3217489085.
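I suspect the discrepancy is simply double-precision rounding: the true product is roughly 3.15e18, far above 2^53, so the low bits are already lost before the |0 truncation ever runs. A quick check in a JS shell (values copied from the program above) seems to confirm it:

// The two operands as unsigned 32-bit values
var d1 = 0x847c9b5d; // 2222758749
var q  = 0x549530e1; // 1419063521
// The double product is ~3.15e18, well above 2^53, so its low bits are
// rounded away; >>> 0 then reduces the rounded value modulo 2^32.
console.log((d1 * q) >>> 0); // 3217488896 here, not the expected 3217489085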
How would one emulate x86's unsigned 32-bit integer multiplication in JavaScript?
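For reference, here is a minimal sketch of the kind of workaround I have in mind: split each operand into 16-bit halves so every partial product stays comfortably below 2^53, then reassemble the low 32 bits (the helper name umul32 is my own):

// Multiply two unsigned 32-bit values and return the low 32 bits of the
// product, avoiding the precision loss of a single double multiplication.
function umul32(a, b) {
    var aHi = a >>> 16, aLo = a & 0xffff;
    var bHi = b >>> 16, bLo = b & 0xffff;
    // aHi*bHi only affects bits 32 and above, so it can be dropped.
    var lo  = aLo * bLo;
    var mid = ((aHi * bLo + aLo * bHi) & 0xffff) * 0x10000;
    return (lo + mid) >>> 0; // force an unsigned 32-bit result
}

console.log(umul32(0x847c9b5d, 0x549530e1)); // 3217489085, matching GCC

Would something along these lines be the right approach, or is there a more standard way (Math.imul(a, b) >>> 0 in newer engines, perhaps)?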