
As far as the language definition is concerned, JavaScript numbers are 64-bit floating-point.

(Except for bitwise operations, which work on 32-bit integers. I suppose the latter is mandated even on a 64-bit CPU, e.g. 1 << 33 has to be 2 because the shift count is taken mod 32, even if the CPU could do better, for backwards compatibility.)
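To illustrate, a quick check in any JS console shows this coercion (plain standard JavaScript, nothing engine-specific assumed):

// Bitwise operands are coerced to signed 32-bit integers (ToInt32),
// and shift counts are taken modulo 32:
console.log(1 << 33);               // 2, same as 1 << 1
console.log(Math.pow(2, 32) | 0);   // 0 -- the value wraps around 32 bits
console.log(Math.pow(2, 31) | 0);   // -2147483648, reinterpreted as signed 32-bit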

However, if a compiler can prove a number is used only as an integer, it may prefer to implement it as such for efficiency, e.g.

for (var i = 0; i < Math.pow(2, 40); i++)
    console.log(i)

Clearly it is desirable to implement this with integers, in which case 64-bit integers must be used for correctness.

Now consider this case:

for (var i = 0; i < Math.pow(2, 60); i++)
    console.log(i)

If implemented with floating-point numbers, the above will fail: double-precision floating point cannot exactly represent every integer above 2^53, so once i reaches 2^53, i++ no longer changes its value and the loop never terminates.
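A minimal demonstration of that failure mode (standard JavaScript, runnable anywhere):

var big = Math.pow(2, 53);          // 9007199254740992
console.log(big + 1 === big);       // true -- 2^53 + 1 rounds back down to 2^53

// So a counter stored as a double gets stuck:
var i = big;
i++;
console.log(i === big);             // true -- i++ no longer advances, and the loop above would spin forever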

If implemented with 64-bit integers, it works fine (well, apart from the inconveniently long run time).

Is a JavaScript compiler allowed (both by the letter of the standard and by compatibility with actual existing code) to use 64-bit integers in such cases where they provide different but better results than floating point?

Similarly, if a JavaScript compiler provides arrays with more than four billion elements, is it allowed to implement array lengths and indexes as 64-bit integers?
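For reference, engines today reject such lengths outright, since the standard defines an array's length as a 32-bit unsigned value; the snippet below (plain standard JavaScript) shows the current behaviour, which is the baseline any 64-bit extension would have to stay compatible with:

// The spec caps Array length at 2^32 - 1; asking for more throws:
try {
    var a = new Array(Math.pow(2, 32));   // 4294967296 elements requested
} catch (e) {
    console.log(e.name);                  // "RangeError" (invalid array length)
}

// Keys at or above 2^32 - 1 are ordinary properties, not array indexes,
// and do not affect length:
var b = [];
b[Math.pow(2, 32)] = "x";
console.log(b.length);                    // 0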

rwallace
  • http://stackoverflow.com/a/9643650/588973 https://developer.mozilla.org/en-US/docs/Mozilla/js-ctypes/js-ctypes_reference/UInt64 https://github.com/broofa/node-int64 – Deele Jul 14 '14 at 04:26

1 Answer


From the MDN js-ctypes reference on UInt64:

"As JavaScript doesn't currently include standard support for 64-bit integer values, js-ctypes offers the Int64 and UInt64 objects to let you work with C functions and data that need (or may need) to use data represented using a 64-bit data type. You use the UInt64 object to create and manipulate 64-bit unsigned integers."
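A minimal sketch of using it, assuming the js-ctypes API as described in that MDN reference (note this only runs in privileged Mozilla code such as extensions, not in ordinary web pages):

// Privileged Mozilla code only: js-ctypes has to be imported first.
Components.utils.import("resource://gre/modules/ctypes.jsm");

// Build a 64-bit unsigned value from a string a double could not hold exactly (2^53 + 1):
var big = ctypes.UInt64("9007199254740993");

console.log(big.toString(10));        // "9007199254740993", the exact value
console.log(ctypes.UInt64.hi(big));   // high 32 bits as an ordinary number
console.log(ctypes.UInt64.lo(big));   // low 32 bits as an ordinary number

// join() reassembles a value from its halves; compare() returns 0 for equal values.
var rebuilt = ctypes.UInt64.join(ctypes.UInt64.hi(big), ctypes.UInt64.lo(big));
console.log(ctypes.UInt64.compare(big, rebuilt) === 0);   // true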

user3611630