
Some of my data are 64-bit integers. I would like to send these to a JavaScript program running on a page.

However, as far as I can tell, integers in most JavaScript implementations are 32-bit signed quantities.

My two options seem to be:

  1. Send the values as strings
  2. Send the values as 64-bit floating point numbers

Option (1) isn't perfect, but option (2) seems far less perfect (loss of data).

How have you handled this situation?

Frank Krueger

6 Answers


There is in fact a limitation at the JavaScript/ECMAScript level: integer precision is limited to 53 bits, because numbers are stored in the mantissa of an IEEE 754 double (an 8-byte floating-point value). So big numbers transmitted as JSON won't be deserialized as the JavaScript client expects; it silently rounds them to its 53-bit resolution:

> parseInt("10765432100123456789")
10765432100123458000

See the Number.MAX_SAFE_INTEGER constant and Number.isSafeInteger() function:

The MAX_SAFE_INTEGER constant has a value of 9007199254740991. The reasoning behind that number is that JavaScript uses double-precision floating-point format numbers as specified in IEEE 754 and can only safely represent numbers between -(2^53 - 1) and 2^53 - 1.

Safe in this context refers to the ability to represent integers exactly and to correctly compare them. For example, Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2 will evaluate to true, which is mathematically incorrect. See Number.isSafeInteger() for more information.
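
Both are easy to check in a console:

> Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2
true
> Number.isSafeInteger(10765432100123456789)
false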

Due to the resolution of floats in JavaScript, using "64-bit floating point numbers" as you proposed would suffer from the very same restriction.

IMHO the best option is to transmit such values as text. It would still be perfectly readable JSON content, and would be easy to work with at the JavaScript level.

A "pure string" representation is what OData specifies, for its Edm.Int64 or Edm.Decimal types.

What the Twitter API does in this case is add a specific ".._str" field in the JSON, like so:

{
   "id": 10765432100123456789,           // for JSON compliant clients
   "id_str": "10765432100123456789",     // for JavaScript
    ...
}

I like this option very much, since it remains compatible with int64-capable clients. In practice, such duplicated content in the JSON won't hurt much if it is deflated/gzipped at the HTTP level.

Once transmitted as string, you may use libraries like strint – a JavaScript library for string-encoded integers to handle such values.
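
For illustration, a minimal sketch of reading such a dual-field payload on the client side (the field names follow the Twitter example above, the value is the one from the parseInt example):

// Hypothetical payload following the Twitter id/id_str convention above
var payload = '{ "id": 10765432100123456789, "id_str": "10765432100123456789" }';
var obj = JSON.parse(payload);

obj.id;      // 10765432100123458000 - precision silently lost by JSON.parse
obj.id_str;  // "10765432100123456789" - exact, kept as a string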

Update: Newer versions of JavaScript engines include a BigInt object class, which can handle more than 53 bits. In fact, it supports arbitrarily large integers, so it is a good fit for 64-bit integer values. Note however that JSON.stringify() does not handle BigInt values out of the box: it throws a TypeError, so you still have to convert them to strings yourself, e.g. via a replacer function.
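
A minimal sketch of round-tripping a 64-bit value through JSON with BigInt, using a replacer function (the field name is made up):

var big = 10765432100123456789n;

// Convert BigInt values to strings during serialization, since
// JSON.stringify() would otherwise throw a TypeError on them.
var json = JSON.stringify({ id: big }, function (key, value) {
    return typeof value === 'bigint' ? value.toString() : value;
});
// json is '{"id":"10765432100123456789"}'

// Convert back to BigInt after parsing.
var id = BigInt(JSON.parse(json).id);  // 10765432100123456789n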

Arnaud Bouchez
  • Great explanation. I just encountered this, with a JS frontend and a C# web API backend. My HttpResponseMessage would serialize my objects (which contained long IDs) and they were being truncated. What was really odd is they weren't truncated until they reached the browser. If I created the response, then parsed the content before shipping it off, the IDs were not truncated. It wasn't until the browser received and parsed the response that it would magically get truncated. Your explanation helped me understand why. Thanks! I ended up registering a custom converter for long types. – Kris Coleman Apr 28 '17 at 15:04

This seems to be less a problem with JSON and more a problem with JavaScript itself. What are you planning to do with these numbers? If it's just a magic token that you need to pass back to the website later on, by all means simply use a string containing the value. If you actually have to do arithmetic on the value, you could write your own JavaScript routines for 64-bit arithmetic.

One way that you could represent such values in JavaScript (and hence JSON) would be by splitting the number into two 32-bit values, e.g.

  [ 12345678, 12345678 ]

To split a 64-bit value into two 32-bit values (in a language with native 64-bit integers, such as C), do something like this:

  output_values[0] = (input_value >> 32) & 0xffffffff;
  output_values[1] = input_value & 0xffffffff;

Then to recombine two 32-bit values to a 64-bit value:

  input_value = (((int64_t) output_values[0]) << 32) | output_values[1];
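
As the comments below point out, the snippets above are C; JavaScript's own bit operators truncate to 32 bits, so input_value >> 32 does not work there. With BigInt available, though, the same split and recombine can be sketched in JavaScript itself (function names are mine, unsigned values assumed):

// Split an unsigned 64-bit BigInt into two 32-bit Numbers.
function split64(value) {
    var high = Number((value >> 32n) & 0xffffffffn);
    var low  = Number(value & 0xffffffffn);
    return [high, low];
}

// Recombine the two 32-bit halves into a BigInt.
function join64(high, low) {
    return (BigInt(high) << 32n) | BigInt(low);
}

join64.apply(null, split64(10765432100123456789n));  // 10765432100123456789n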
Simon Howard
  • When I try "input_value >> 32", I just get back "input_value", as if the 32 bits are just wrapping fully around. ">>>" has the same effect. Yet when I try "Math.floor(input_value / 4294967296)" (4294967296 = Math.pow(2, 32)), I get the high 32-bit part of "input_value". What am I missing? – snapfractalpop Jul 04 '12 at 15:49
  • You don't say what language you're using, so I'm going to assume you're using Java. Is 'input_value' a long (ie. 64-bit integer)? If it's an int (32-bit integer), then bitshifting left 32-bits won't work. – Simon Howard Jul 05 '12 at 18:35
  • sorry about that.. I should have been explicit. I am using javascript (in nodejs) and was actually surprised to see that bitshifting 32 bits to the right somehow gave me the same number. I don't entirely understand all the implicit type conversions in javascript and their underlying representations. – snapfractalpop Jul 06 '12 at 03:46
  • It still *is* a JSON problem, not a Javascript problem, in that JSON does not support integers with more than 9 decimal digits. If it did, the Javascript implementation would be forced to provide support for it somehow under the hood. – Lightness Races in Orbit Sep 22 '12 at 13:18
  • @LightnessRacesinOrbit, I can't seem to find a reference to the 9 decimal digits statement. Is that in some sort of a spec? – akhaku Mar 29 '13 at 21:28
  • @akhaku: It's in the version of the spec (RFC 4627 & www.json.org) that I invented in my head when I misinterpreted the grammar. :) – Lightness Races in Orbit Mar 30 '13 at 14:15

JavaScript's Number type (64-bit IEEE 754 double) has only 53 bits of integer precision.

But if you don't need to do any addition or multiplication, you could keep a 64-bit value as a 4-character string, since JavaScript strings use UTF-16 (16 bits per character).

For example, 1 could be encoded as "\u0000\u0000\u0000\u0001". This has the advantage that value comparison (==, >, <) works on strings as expected. It also seems straightforward to write bit operations:

function and64(a,b) {
    var r = "";
    for (var i = 0; i < 4; i++)
        r += String.fromCharCode(a.charCodeAt(i) & b.charCodeAt(i));
    return r;
}
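
A sketch of converting between BigInt values and such 4-character strings (function names are mine, unsigned 64-bit values assumed). Note that JSON escapes the control characters (e.g. as \u0000), so these strings survive JSON transport, at the cost of readability:

// Encode an unsigned 64-bit BigInt as a 4-character UTF-16 string,
// 16 bits per character, most significant character first.
function encode64(value) {
    var s = "";
    for (var i = 3; i >= 0; i--)
        s += String.fromCharCode(Number((value >> BigInt(i * 16)) & 0xffffn));
    return s;
}

// Decode such a 4-character string back into a BigInt.
function decode64(s) {
    var v = 0n;
    for (var i = 0; i < 4; i++)
        v = (v << 16n) | BigInt(s.charCodeAt(i));
    return v;
}

encode64(1n);                               // "\u0000\u0000\u0000\u0001"
decode64(encode64(10765432100123456789n));  // 10765432100123456789n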
David Leonard
  • That's a clever way of staying frugal with bytes over the wire, while still being able to more or less read the number. I'd be interested to know if there are any gotchas with this approach. – acjay Nov 10 '17 at 13:30

The JS number representation is a standard IEEE double, so you can't represent a 64-bit integer exactly. You get 53 bits of actual integer precision in a double, but all JS bit operations reduce to 32-bit precision (that's what the spec requires, yay!), so if you really need a 64-bit int in JS you'll need to implement your own 64-bit int logic library.
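
Both limits are easy to demonstrate in a console:

> Math.pow(2, 53) - 1         // full integer precision of a double
9007199254740991
> (Math.pow(2, 53) - 1) | 0   // bit operations truncate to 32 bits
-1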

olliej

JSON itself doesn't care about implementation limits. Your problem is that JS can't handle your data, not the protocol. In other words, your JS client code has to use one of those non-perfect options.

Javier

This happened to me. All hell broke loose when sending large integers via JSON into JSON.parse. I spent days trying to debug. The problem was immediately solved when I transmitted the values as strings.

Use { "the_sequence_number": "20200707105904535" } instead of { "the_sequence_number": 20200707105904535 }

To make it worse, it would seem that JSON.parse is implemented by some library shared between Firefox, Chrome and Opera, because they all behaved exactly the same. Opera error messages contain Chrome URL references in them, almost as if the browsers share WebKit.

console.log('event_listen[' + global_weird_counter + ']: to be sure, server responded with [' + aresponsetxt + ']');
var response = JSON.parse(aresponsetxt);
console.log('event_listen[' + global_weird_counter + ']: after json parse: ' + JSON.stringify(response));

The behaviour I got was the sort of stuff you see when pointer math goes horribly bad. Ghosts were flying out of my workstation, wreaking havoc in my sleep. They are all exorcised now that I switched to strings.