
I have a Node.js readstream that emits Buffers. I convert each buffer to a string with toString(), but when I then try to convert the string to an object via JSON.parse(), it throws a parse error.

Is there a better way to convert the buffer to a string and then parse that string as JSON?

The JSON string looks like this:

[{"data1": 1487328824948, "encrypt": false, "version": "1.0.0", "data2": "value2", "data3": "value3", "data4": "value4", "data5": "value5"},{"data1": 148732882448, "encrypt": false, "version": "1.0.0", "data2": "value2", "data3": "value3", "data4": "value4", "data5": "value5"}.........]

Sai

3 Answers

var buf = Buffer.from(JSON.stringify(obj)); // serialize the object into a Buffer
var temp = JSON.parse(buf.toString());      // decode the Buffer and parse it back
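Expanding the answer above into a self-contained round trip (the object literal here is a made-up sample modeled on the question's data):

```javascript
// Round trip: object -> JSON string -> Buffer -> string -> object.
const obj = { data1: 1487328824948, encrypt: false, version: "1.0.0" };

const buf = Buffer.from(JSON.stringify(obj)); // serialize, then wrap in a Buffer
const str = buf.toString("utf8");             // decode the Buffer back to a string
const temp = JSON.parse(str);                 // parse the string into an object

console.log(temp.data1); // 1487328824948
```

This works when `buf` holds the complete JSON document; if the buffer is only one chunk of a stream, JSON.parse will fail exactly as described in the question.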

I was able to parse the incoming stream using the JSONStream package (https://github.com/dominictarr/JSONStream). It really helped in this use case; a nice and handy tool.

Related StackOverflow post - Parse large JSON file in Nodejs
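A typical JSONStream pipeline for this use case might look like the sketch below. It assumes `npm install JSONStream` and an illustrative gzipped input file named `data.json.gz`; the filename and the field accessed in the callback are placeholders, not from the original question.

```javascript
// Sketch: stream-parse a large JSON array element by element with JSONStream,
// so the whole document never has to fit in memory as one string.
const fs = require('fs');
const zlib = require('zlib');
const JSONStream = require('JSONStream'); // third-party package, not Node core

fs.createReadStream('data.json.gz')
  .pipe(zlib.createGunzip())            // decompress the gzip stream
  .pipe(JSONStream.parse('*'))          // emit one top-level array element at a time
  .on('data', (record) => {
    console.log(record.data1);          // each record arrives as a parsed object
  })
  .on('error', (err) => console.error('parse failed:', err));
```

Because JSONStream buffers and parses incrementally, the 'data' handler only ever sees complete objects, which avoids the partial-chunk JSON.parse errors.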

Sai

This seems like the right approach, but it is likely that your readstream hasn't finished reading the input before you call JSON.parse(). In that case JSON.parse() receives only part of your JSON string, which causes the error.

Try making sure the read finishes before you parse - use readSync()?
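The point above can be sketched without any extra packages: collect every 'data' chunk and call JSON.parse only once the stream has ended. The handlers below are plain functions standing in for stream listeners, and the JSON sample and split point are made up for illustration:

```javascript
// Collect every chunk; parse only after the whole stream has arrived.
const chunks = [];

function onData(chunk) {                 // would be wired as stream.on('data', onData)
  chunks.push(Buffer.from(chunk));       // keep raw bytes; don't toString() per chunk
}

function onEnd() {                       // would be wired as stream.on('end', onEnd)
  const full = Buffer.concat(chunks).toString('utf8');
  return JSON.parse(full);               // parse once, on the complete document
}

// Simulate a stream that splits the JSON mid-object:
const json = '[{"data1": 1, "encrypt": false}, {"data1": 2}]';
onData(json.slice(0, 20));               // first chunk ends inside a key name
onData(json.slice(20));                  // second chunk completes the document
const records = onEnd();
console.log(records.length); // 2
```

Calling JSON.parse on either chunk alone would throw; only the concatenation of all chunks is valid JSON. Note that this buffers the whole document in memory, which is why the streaming-parser approach in the accepted answer is preferable for very large inputs.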

ahoang18
  • The readstream I'm dealing with here comes out of a tar.gz file, and it continuously streams the JSON-like string. As the data comes in, in the .on('data') listener I convert the buffer to a string and then to an object via JSON.parse so I can start processing it. I'm not sure readSync would help here, and it might also slow down the process since the tar.gz is pretty large. – Sai Dec 21 '17 at 14:29
  • I added the JSON string that I'm dealing with; similar to the one I posted, it keeps coming out of the tar.gz. – Sai Dec 21 '17 at 14:36