
Given the following code, where both `a` and `b` are Numbers representing values within the range of signed 32-bit integers:

```js
var quotient = ((a|0) / (b|0))|0;
```

and assuming that the runtime fully complies with the ECMAScript 6 specification, will the value of `quotient` always be the correct signed integer division of `a` and `b` as integers? In other words, is this a proper way to achieve true signed integer division in JavaScript, equivalent to a machine-level integer division instruction?
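For concreteness, a minimal sketch of how the expression behaves when wrapped in a helper (the name `idiv` is purely illustrative):

```js
// Illustrative wrapper around the expression in question.
function idiv(a, b) {
  return ((a | 0) / (b | 0)) | 0;
}

console.log(idiv(7, 2));   // 3
console.log(idiv(-7, 2));  // -3 (truncates toward zero)
console.log(idiv(7, -2));  // -3
```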

PM 77-1
lcmylin
  • Have you tried looking for any counterexamples that might prove it *isn't* always correct? – Purag Jul 11 '15 at 05:33
  • I have not. Since JavaScript formally performs all arithmetic in floating point, I see the question as coming down to this: is the result of the double-precision division of two mathematical integers in the range of a 32-bit signed integer, followed by truncation to a 32-bit signed integer with overflow simulated according to the `ToInt32()` abstract operation specified by ECMAScript 6, equivalent to an integer division of the same mathematical values? I don't feel I understand the process of floating-point division well enough to answer this myself or derive counterexamples, which is why I asked here. – lcmylin Jul 11 '15 at 06:01
  • For division by zero, no — it returns 0 instead of throwing an exception: https://stackoverflow.com/questions/29179876/how-does-asm-js-handle-divide-by-zero – gengkev Jul 11 '15 at 06:35
  • [`Math.trunc`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/trunc) might be useful – royhowie Jul 11 '15 at 07:19

1 Answer


I'm no expert on floating-point numbers, but Wikipedia says that doubles have a 52-bit mantissa, which gives 53 bits of effective precision. That should be enough to handle integer division of 32-bit integers reliably: every 32-bit integer is exactly representable in a double, and IEEE 754 division is correctly rounded, so the computed quotient never strays far enough from the true quotient to truncate to the wrong integer.
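A quick sketch of the exact-representability point, using the ES6 `Number.isSafeInteger`:

```js
// Every 32-bit integer fits exactly in a double's 53-bit significand.
console.log(Number.isSafeInteger(2147483647));   // true  (2^31 - 1)
console.log(Number.isSafeInteger(-2147483648));  // true  (-2^31)
console.log(Number.MAX_SAFE_INTEGER);            // 9007199254740991 (2^53 - 1)
```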

Dividing the minimum and maximum 32-bit signed ints, -2147483648 / 2147483647, produces -1.0000000004656613, which still carries plenty of significant digits. The same goes for its reciprocal, 2147483647 / -2147483648, which produces -0.9999999995343387.
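These values are easy to verify in any compliant engine:

```js
// Spot-check the extreme quotients quoted above.
console.log(-2147483648 / 2147483647);        // -1.0000000004656613
console.log(2147483647 / -2147483648);        // -0.9999999995343387

// Truncating with |0 recovers the exact integer quotients.
console.log((-2147483648 / 2147483647) | 0);  // -1
console.log((2147483647 / -2147483648) | 0);  // 0
```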

An exception is division by zero, which I mentioned in a comment. As the linked SO question states, integer division by zero normally raises some sort of error or hardware exception, whereas the floating-point coercion here yields `(1 / 0) | 0 === 0`.
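A minimal demonstration; note that the NaN from `0 / 0` is also coerced to 0 by ToInt32:

```js
// Division by zero yields Infinity or NaN; ToInt32 maps both to 0.
console.log((1 / 0) | 0);   // 0  (Infinity -> 0)
console.log((-1 / 0) | 0);  // 0  (-Infinity -> 0)
console.log((0 / 0) | 0);   // 0  (NaN -> 0)
```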

Update: According to another SO answer, integer division in C truncates towards zero, which is exactly what `|0` does in JavaScript. In addition, division by zero is undefined behavior in C, so JavaScript is technically not incorrect in returning zero. Unless I've missed anything else, the answer to the original question should be yes.
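A short sketch contrasting truncation with flooring:

```js
// |0 truncates toward zero, matching C's integer division semantics.
console.log((7 / 2) | 0);         // 3
console.log((-7 / 2) | 0);        // -3 (not -4)
console.log(Math.floor(-7 / 2));  // -4, for contrast
```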

Update 2: The relevant parts of the ECMAScript 6 spec are the section describing how the `/` operator divides its operands and the `ToInt32` abstract operation describing how a value is converted to a 32-bit signed integer, which is what `|0` does.
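A small sketch of the wrap-around that ToInt32 performs at the 32-bit boundary:

```js
// ToInt32 reduces the value modulo 2^32 into the signed 32-bit range.
console.log(2147483648 | 0);  // -2147483648 (2^31 wraps to -2^31)
console.log(4294967296 | 0);  // 0 (2^32 wraps to 0)
```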

gengkev