
I see that JSON.stringify and JSON.parse are both synchronous.

I would like to know if there is a simple npm library that does this in an asynchronous way.

Thank you

Ngend Lio

3 Answers


You can make anything "asynchronous" by using Promises:

function asyncStringify(str) {
  return new Promise((resolve, reject) => {
    resolve(JSON.stringify(str));
  });
}

Then you can use it like any other promise:

asyncStringify(str).then(ajaxSubmit);

Note that because the code is not actually asynchronous, the promise resolves right away (stringifying JSON performs no blocking operation; it doesn't require any system call).
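A quick way to see this for yourself: the Promise executor runs synchronously, so all the stringify work happens before the call even returns. A self-contained timing sketch (the payload size is arbitrary, just for illustration):

```javascript
// The Promise wrapper does not move work off the main thread: the
// executor (and thus JSON.stringify) runs synchronously, before the
// call to asyncStringify even returns.
function asyncStringify(value) {
  return new Promise((resolve, reject) => {
    resolve(JSON.stringify(value));
  });
}

// Build a reasonably large object.
const big = { items: Array.from({ length: 100000 }, (_, i) => ({ i })) };

const start = Date.now();
asyncStringify(big); // blocks here until the whole string is built
console.log(`event loop was blocked for ${Date.now() - start} ms`);
```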

You can also use the async/await API if your platform supports it:

async function asyncStringify(str) {
  return JSON.stringify(str);
}

Then you can use it the same way:

asyncStringify(str).then(ajaxSubmit);
// or use the "await" API
const strJson = await asyncStringify(str);
ajaxSubmit(strJson);

Edited: One way of adding truly asynchronous parsing/stringifying (maybe because we're parsing something too complex) is to pass the job to another process (or service) and wait for the response.

You can do this in many ways (for example, by creating a new service that exposes a REST API); here I will demonstrate a way of doing it with message passing between processes:

First create a file that will take care of doing the parsing/stringifying. Call it async-json.js for the sake of the example:

// async-json.js
function stringify(value) {
  return JSON.stringify(value);
}

function parse(value) {
  return JSON.parse(value);
}

process.on('message', function(message) {
  let result;
  if (message.method === 'stringify') {
    result = stringify(message.value);
  } else if (message.method === 'parse') {
    result = parse(message.value);
  }
  process.send({ callerId: message.callerId, returnValue: result });
});

All this process does is wait for a message asking it to stringify or parse JSON, then respond with the result. Now, in your code, you can fork this script and send messages back and forth. Whenever a request is sent, you create a new promise; whenever a response to that request comes back, you resolve the promise:

const fork = require('child_process').fork;
const asyncJson = fork(__dirname + '/async-json.js');

const callers = {};

asyncJson.on('message', function(response) {
  callers[response.callerId].resolve(response.returnValue);
  delete callers[response.callerId]; // free the entry once answered
});

function callAsyncJson(method, value) {
  const callerId = Math.floor(Math.random() * 1000000); // simple (collision-prone) id
  const callPromise = new Promise((resolve, reject) => {
    callers[callerId] = { resolve: resolve, reject: reject };
    asyncJson.send({ callerId: callerId, method: method, value: value });
  });
  return callPromise;
}

function JsonStringify(value) {
  return callAsyncJson('stringify', value);
}

function JsonParse(value) {
  return callAsyncJson('parse', value);
}

JsonStringify({ a: 1 }).then(console.log.bind(console));
JsonParse('{ "a": "1" }').then(console.log.bind(console));

Note: this is just one example, but knowing this you can figure out other improvements or other ways to do it. Hope this is helpful.

Alex Santos
  • Thank you. But you are calling JSON.stringify in your code. Does it make the code asynchronous? – Ngend Lio Oct 09 '17 at 13:03
  • What about parsing? – Mark Oct 09 '17 at 13:27
  • @Mark_M parsing will be the same thing, just change `stringify` with `parse`. – Alex Santos Oct 09 '17 at 15:29
  • 8
    But that doesn't really make it asynchronous -- doesn't it just delay the blocking until the next event loop. If the goal is parse a large JSON object without blocking, I don't think this helps. – Mark Oct 09 '17 at 15:31
  • 1
    @NgendLio node runs async code that is blocking, so it won't actually be async because `stringify` is a "all-in-memory algorithm", it does not do any system calls, it never blocks the event loop. You can use processes and message passing if you really need it to be done async or explain the goal and ther emight be easier solutions (later I'll edit my response with true async) – Alex Santos Oct 09 '17 at 15:31
  • @Mark_M I have added an example of how to achieve "asynchronicity" with message passing. This can be applied to pretty much anything. Note that it's just an example to help with thinking of different approaches; it can be improved and done in other ways. – Alex Santos Oct 09 '17 at 16:02
  • Wrong answer. Not asynchronous (and potentially dangerous if someone thinks it is!), as others have said as well – Pouria Moosavi Feb 19 '23 at 11:52

Check out another npm package:

async-json is a library that provides an asynchronous version of the standard JSON.stringify.

Install:

npm install async-json

Example:

var asyncJSON = require('async-json');

asyncJSON.stringify({ some: "data" }, function (err, jsonValue) {
    if (err) {
        throw err;
    }

    console.log(jsonValue); // '{"some":"data"}'
});

Note: I didn't test it; you need to manually check its dependencies and required packages.

Utkarsh Dubey

By asynchronous I assume you actually mean non-blocking asynchronous - i.e., if you have a large (megabytes-large) JSON string and you stringify it, you don't want your web server to hard-freeze and block newly incoming web requests for 500+ milliseconds while it processes the object.

Option 1

The generic answer is to iterate through your object piece by piece and call setImmediate whenever a threshold is reached. This allows other functions in the event queue to run for a bit.
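To illustrate the idea (this is a toy, not any library's actual implementation): a chunked stringifier for flat arrays that yields to the event loop with setImmediate between chunks. A real version must also handle nesting, objects, string escaping, and so on:

```javascript
// Stringify a flat array in chunks, yielding between chunks so other
// queued callbacks get a chance to run.
function stringifyArrayAsync(arr, chunkSize = 1000) {
  return new Promise((resolve) => {
    const parts = [];
    let i = 0;
    function next() {
      const end = Math.min(i + chunkSize, arr.length);
      for (; i < end; i++) parts.push(JSON.stringify(arr[i]));
      if (i < arr.length) {
        setImmediate(next); // yield to the event loop, then continue
      } else {
        resolve('[' + parts.join(',') + ']');
      }
    }
    next();
  });
}
```

Per-chunk cost stays bounded by chunkSize, so the longest single stretch of blocking is small even for huge inputs.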

For JSON (de)serialization, the yieldable-json library does this very well. It does, however, drastically sacrifice JSON processing speed (which is somewhat intentional).

Usage example from the yieldable-json readme:

const yj = require('yieldable-json')
yj.stringifyAsync({key:"value"}, (err, data) => {
  if (!err)
    console.log(data)
})

Option 2

If processing speed is extremely important (such as with real-time data), you may want to consider spawning multiple Node processes instead. I've used the PM2 Process Manager with great success, although the initial setup was quite daunting. Once it works, however, the final result is magic, and it does not require modifying your source code, just your package.json file. It acts as a proxy, load balancer, and monitoring tool for Node applications. It's somewhat analogous to Docker swarm, but bare metal, and does not require a special client on the server.

aggregate1166877