I'm struggling to parse a JSON string that contains very large numbers.
Consider the following JSON string:
const jsonString = `{
  "id": "eb2552e1673f4ad6bb70923e5920bbab",
  "Int32": 2147483647,
  "Int64": 9223372036854775807,
  "Int64Array": [9223372036854775807, 9223372036854775806],
  "Object": {
    "Int321": 2147483647,
    "Int641": 9223372036854775807,
    "Int64Array1": [9223372036854775807, 9223372036854775806]
  }
}`;
My objective is to parse this string into a JavaScript object. With JSON.parse(jsonString), I get the following result:
{
  id: 'eb2552e1673f4ad6bb70923e5920bbab',
  Int32: 2147483647,
  Int64: 9223372036854776000,
  Int64Array: [ 9223372036854776000, 9223372036854776000 ],
  Object: {
    Int321: 2147483647,
    Int641: 9223372036854776000,
    Int64Array1: [ 9223372036854776000, 9223372036854776000 ]
  }
}
As you can see, 9223372036854775807 and 9223372036854775806 both change to 9223372036854776000, which is not acceptable for my use case.
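As far as I can tell, this happens because JSON.parse returns ordinary JavaScript numbers (IEEE-754 doubles), which can only represent integers exactly up to Number.MAX_SAFE_INTEGER. A quick check of my own (not part of my actual code) shows both literals collapse to the same double:

// Rough illustration of the precision loss, not from my real code:
console.log(Number.MAX_SAFE_INTEGER);        // 9007199254740991 (2^53 - 1)
console.log(Number('9223372036854775807'));  // 9223372036854776000 - nearest representable double
console.log(9223372036854775807 === 9223372036854775806); // true - both literals round to the same double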
Then I read about BigInt and the reviver parameter of JSON.parse, and changed my code to something like this:
const parsedJSON2 = JSON.parse(jsonString, (key, value) => {
  if (typeof value === 'number') {
    return BigInt(value).toString();
  } else {
    return value;
  }
});
And this time the output is:
{
  id: 'eb2552e1673f4ad6bb70923e5920bbab',
  Int32: '2147483647',
  Int64: '9223372036854775808',
  Int64Array: [ '9223372036854775808', '9223372036854775808' ],
  Object: {
    Int321: '2147483647',
    Int641: '9223372036854775808',
    Int64Array1: [ '9223372036854775808', '9223372036854775808' ]
  }
}
This is somewhat better than the previous attempt, but here 9223372036854775807 and 9223372036854775806 both change to 9223372036854775808, so it is still not accurate.
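My understanding (which may be wrong) is that the reviver only runs after the number has already been converted to a double, so BigInt(value) just converts the already-rounded value and cannot recover the original digits. A small check of my own:

// By the time the reviver sees the value, the original digits are gone:
const alreadyRounded = 9223372036854775807;      // the literal is rounded to the double 2^63
console.log(String(alreadyRounded));             // "9223372036854776000"
console.log(BigInt(alreadyRounded).toString());  // "9223372036854775808" - exact integer value of that double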
I am wondering if there's a better way to parse JSON containing really large numbers. I am completely fine with converting these numbers into strings as long as I can have 100% fidelity in the conversion.
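One direction I came across but have not been able to verify is the newer third "context" argument to the JSON.parse reviver (the "source text access" feature, which I believe is only available in recent engines such as current Chrome and Node.js). Its context.source is supposed to hold the raw text of each primitive value, so something like the following sketch might work, assuming the runtime supports it and the oversized values are plain integers:

// Sketch only - assumes a runtime where the reviver receives a third
// "context" argument exposing the raw source text of primitive values.
const parsedJSON3 = JSON.parse(jsonString, (key, value, context) => {
  // For numbers that cannot be represented exactly as doubles, fall back
  // to the original digits from the JSON text.
  if (typeof value === 'number' && !Number.isSafeInteger(value)) {
    return BigInt(context.source).toString(); // or return a BigInt directly
  }
  return value;
});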
Any help regarding this will be highly appreciated.