I'm running into a buffer error when I pull image data from MongoDB and convert it to base64. I know that storing images in MongoDB is not optimal, but I'd like to do it just to do it.

Here's the error that I'm receiving:

'start-server' errored after 6.28 s
[22:21:05] Error: stdout maxBuffer exceeded
    at Socket.<anonymous> (child_process.js:255:14)
    at emitOne (events.js:90:13)
    at Socket.emit (events.js:182:7)
    at readableAddChunk (_stream_readable.js:153:18)
    at Socket.Readable.push (_stream_readable.js:111:10)
    at Pipe.onread (net.js:529:20)

Here's the route that fetches and transforms the data:

app.get('/api/photos', function (req, res) {
  Photo.find({}, function (err, photos) {
    if (err) {
      return res.status(500).json({info: 'error during find', error: err});
    }
    if (photos.length !== 0) {
      var photosData = [];
      for (var i = 0; i < photos.length; i++) {
        // Buffer.from replaces the deprecated new Buffer(...) constructor
        var thumb = Buffer.from(photos[i].img.data).toString('base64');
        photosData.push(thumb);
      }
      res.json({info: 'it worked', photos: photosData});
    } else {
      res.json({info: 'it worked', photos: false});
    }
  });
});
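
(For context: the Photo model isn't shown above, but a minimal Mongoose schema consistent with photos[i].img.data would look something like this; the contentType field is an assumption, not taken from the original code.)

var mongoose = require('mongoose');

// Minimal sketch of a Photo model matching the route above.
var photoSchema = new mongoose.Schema({
  img: {
    data: Buffer,       // raw image bytes read by the route
    contentType: String // e.g. 'image/jpeg' (assumed field)
  }
});

var Photo = mongoose.model('Photo', photoSchema);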

1 Answer

See this question for details on maxBuffer: Reading binary data from a child process in Node.js

The error actually came from my gulpfile: I had an exec task that started my server, and the server's stdout output exceeded the default buffer for that child process. I had to add a larger maxBuffer to the exec options, as seen below (5000*1024 is roughly 5 MB).

var gulp = require('gulp');
var exec = require('child_process').exec;

var started = false;

gulp.task('start-server', ['build', 'watch'], function (cb) {
  if (!started) {
    started = true; // set immediately so the task can't spawn the server twice
    // Raise maxBuffer (the default was 200*1024 bytes in older Node versions)
    // so the server's stdout doesn't overflow the child process's buffer.
    exec('node dist/server.js', {maxBuffer: 5000 * 1024}, function (err, stdout, stderr) {
      console.log(stdout);
      console.log(stderr);
      cb(err);
    });
  } else {
    cb(); // already running; complete the task so gulp doesn't hang
  }
});
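
Note that if your server logs a lot (e.g. dumping those base64 strings), bumping maxBuffer only raises the ceiling; it doesn't remove it. A sketch of an alternative using spawn, which streams output instead of buffering it, so there's no limit to hit (the task name here is hypothetical; this isn't what I ended up doing, just an option):

var spawn = require('child_process').spawn;

gulp.task('start-server-spawned', ['build', 'watch'], function (cb) {
  // spawn streams stdout/stderr instead of buffering them in memory,
  // so maxBuffer doesn't apply however much the server logs.
  var server = spawn('node', ['dist/server.js'], {stdio: 'inherit'});
  server.on('error', function (err) {
    console.error(err); // e.g. node binary not found
  });
  cb(); // signal gulp immediately; the server keeps running
});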