4

I have been getting the error FATAL ERROR: JS Allocation failed - process out of memory, and I have pinpointed the problem: I am sending a really large JSON object to res.json (or JSON.stringify). For context, I am sending around 30,000 config files (each around 10,000 lines) as one JSON object.

My question is: is there a way to send such a huge JSON object, or is there a better way to stream it (for example with socket.io)?

I am using: node v0.10.33, express@4.10.2

UPDATE: Sample code

var app = express();

app.route('/events')
.get(function(req, res, next) {
  var configdata = [{config:<10,000 lines of config>}, ... 10,000 configs]
  res.json(configdata); // The out of memory error comes here
})
amulllb
  • Can you give a little code sample? – Evan Hahn May 12 '15 at 21:44
  • I have added a code sample – amulllb May 13 '15 at 16:19
  • Surely the answer is NOT to send 30,000 config files at a time? This is a code smell. Even one JSON download with 10,000 lines is a lot. – BenjaminPaul May 13 '15 at 16:21
  • Yes, I agree... it was poor code design. I have fixed it with socket.io to stream each config file rather than send them all at once – amulllb May 13 '15 at 17:29
  • Also, since I am storing each config file as a DB entry, I cannot stream line by line. Plus, reading this article, I see that 10k lines in one config file is OK to be read all at once: http://josh.zeigler.us/technology/web-development/how-big-is-too-big-for-json/ – amulllb May 13 '15 at 17:31

3 Answers

4

After a lot of trial and error, I finally decided to go with socket.io and send one config file at a time rather than all of them at once. This solved the out-of-memory error that was crashing my server. Thanks for all your help.
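A minimal sketch of the per-file approach (the socket.io call is abstracted into a plain emit callback here so the chunking logic stands on its own; in the real handler, emit would be socket.emit('config', config) and done would be socket.emit('done')):

```javascript
// Hypothetical helper: push configs to the client one at a time.
// `emit` stands in for socket.emit('config', ...); `done` signals completion.
function sendConfigsOneByOne(configs, emit, done) {
  configs.forEach(function (config) {
    emit(config); // each payload is a single config, never the whole array
  });
  done();
}
```

The point is that each emitted payload is small enough to serialize on its own, so the process never has to build one giant string.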

amulllb
2

Try to use streams. What you need is a readable stream that produces data on demand. I'll write simplified code here:

var Readable = require('stream').Readable;
var rs = new Readable();

rs._read = function () {
    // assuming 10,000 lines of config fits in memory;
    // a non-object-mode stream only accepts strings/Buffers, so stringify each chunk
    rs.push(JSON.stringify({config: <10,000 lines of config>}));
    rs.push(null); // signal the end of the stream
};

rs.pipe(res);
Magomogo
0

You can try increasing the memory available to Node with the --max_old_space_size flag on the command line, e.g. node --max_old_space_size=4096 server.js (the value is in megabytes).

There may be a more elegant solution. My first reaction was to suggest using res.json() with a Buffer object rather than trying to send the entire object in one shot, but then I realized that whatever converts the object to JSON will probably need the entire object in memory at once anyway. So you would run out of memory even after switching to a stream. Or at least that's what I would expect.
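One way around the all-at-once conversion is to never call JSON.stringify on the whole array: write the brackets and elements piece by piece, since the Express response is itself a writable stream. A sketch, assuming res is any object with write()/end() (writeConfigsAsJson is an illustrative helper, not an existing API):

```javascript
// Write a JSON array to a writable response one element at a time,
// so only a single config is ever held as a string in memory.
function writeConfigsAsJson(configs, res) {
  res.write('[');
  configs.forEach(function (config, i) {
    if (i > 0) res.write(',');
    res.write(JSON.stringify(config)); // stringify one config, not the whole array
  });
  res.end(']');
}
```

In the route this would be preceded by res.type('json') so the client still sees an ordinary JSON response.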

Trott
  • Well, I can surely stringify one config file at a time and that is OK. The problem comes when I try stringifying 10,000 config files (each config file having 10,000 lines). Can you point me to an example of using a Buffer object? Otherwise, I think I will go with the socket.io streaming method... – amulllb May 13 '15 at 16:18