I have started building a PHP and Node.js app for real-time sports score updates. My PHP and Node servers work fine with the node modules express, socket.io and zeromq. I receive a large data set from an API; in the PHP file I package the JSON data and send it to the Node server through ZeroMQ, where it is received in the Node.js server file and then sent on to the client side. The setup works totally fine for small sets of data, but when the payload is large, the Node server fails with the error listed below.

This is the error I get while trying to send to the client through socket.io in the Node server:

node: ../node_modules/nan/nan.h:822: Nan::MaybeLocal Nan::NewBuffer(char*, size_t, node::Buffer::FreeCallback, void*): Assertion `length <= imp::kMaxLength && "too large buffer"' failed. Aborted (core dumped)

This is the main node_socket.js:

var express = require('express');
var app = express();
var fs = require('fs');

var options = {
  key: fs.readFileSync('/etc/letsencrypt/live/example.com/privkey.pem'),
  cert: fs.readFileSync('/etc/letsencrypt/live/example.com/fullchain.pem'),
  ca: fs.readFileSync('/etc/letsencrypt/live/example.com/chain.pem')
};

var https = require('https').Server(options, app);

var zmq = require('zeromq')
  , sock = zmq.socket('pull');
sock.bind('tcp://10.150.0.6:1111');

var io = require('socket.io')(https);

io.on('connection', function(socket){

  socket.on('disconnect', function(){
    console.log("client disconnected");
  });

  sock.on('message', function(msg){
    console.log('work: %s', msg.toString());
    socket.emit('latest_score', msg.toString());
  });

});

https.listen(3333);
sock.on('connect', function(fd, ep){ console.log('connect, endpoint:', ep); });

console.log('App connected to port 3333');

Please note the app works fine with small data but just can't handle the large JSON data being sent from the PHP file. I have tried a few different things over the past few days, but to no avail. I also hired a few Node.js developers from fiverr.com, but they couldn't solve the problem either. I am hoping someone here will point me in the right direction.

Matt

1 Answer

From the Node.js documentation on Buffer (https://nodejs.org/api/buffer.html):

buffer.constants.MAX_LENGTH
Added in: v8.2.0
<integer> The largest size allowed for a single Buffer instance.
On 32-bit architectures, this value is (2^30)-1 (~1GB). On 64-bit architectures, this value is (2^31)-1 (~2GB).

This value is also available as buffer.kMaxLength.
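
As a quick sanity check you can print this limit on the machine running your Node server (a minimal sketch; the value in the comment assumes a 64-bit build of the Node versions current at the time of writing):

const buffer = require('buffer');

// prints 2147483647, i.e. (2^31)-1, on a 64-bit build
console.log(buffer.constants.MAX_LENGTH);
// older alias, same value
console.log(buffer.kMaxLength);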

So assuming you're on a 64-bit system, you seem to be sending more than 2GB of data at once. Assuming you have a big JSON array of data, the easiest way would be to split the array into chunks and send them through the socket individually.

So here:

sock.on('message', function(msg){
  console.log('work: %s', msg.toString());
  socket.emit('latest_score', msg.toString());
});

you need to JSON.parse() the data, split it up, JSON.stringify() each chunk, and send the chunks individually through the socket, as sketched below.

See here how to split the array: Split array into chunks
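
A minimal sketch of that idea, assuming the payload parses to one big array; the chunk size of 500 and the latest_score_chunk / latest_score_done event names are illustrative, so your client would need matching listeners:

sock.on('message', function(msg){
  // assumption: the PHP side sends one large JSON array
  var scores = JSON.parse(msg.toString());
  var chunkSize = 500; // tune so each emit stays well under the Buffer limit

  for (var i = 0; i < scores.length; i += chunkSize) {
    var chunk = scores.slice(i, i + chunkSize);
    socket.emit('latest_score_chunk', JSON.stringify(chunk));
  }

  // signal the client that all chunks for this update have been sent
  socket.emit('latest_score_done');
});

On the client side you would collect the latest_score_chunk events into an array until latest_score_done arrives, then render the combined result.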

Update (because of the comment): If you absolutely can't split the data, you could store it in PHP (in a database or a file) and build a REST API to query the data. Then just send the ID of the file/row through ZeroMQ and Node.js. The client then has to call the REST API to get the actual data.
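
On the Node side that reduces to forwarding a small identifier instead of the payload itself; a minimal sketch, where /api/scores/<id> is a hypothetical REST endpoint you would have to build in PHP:

// Node side: PHP now sends only the id of the stored row/file over ZeroMQ
sock.on('message', function(msg){
  socket.emit('latest_score_id', msg.toString());
});

// Client side (browser): fetch the actual data over HTTP
socket.on('latest_score_id', function(id){
  fetch('https://example.com/api/scores/' + id) // hypothetical endpoint
    .then(function(res){ return res.json(); })
    .then(function(scores){ /* render the scores */ });
});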

Fels
  • Appreciate your answer. However, assuming that I am sending one single large array instead of multiple arrays, what could be the possible solution? I am saying this because I am sending a large page file whose JSON also contains HTML elements – Matt Jun 28 '19 at 10:51
  • I updated my answer. It would really help if you could show us some example data. – Fels Jun 28 '19 at 11:14
  • you can check the json data here http://run.sportsfier.com/json-data.php. It comes all together in one JSON object. I have limited options on the PHP side to send it as multiple arrays, and I am not sure how I should deal with splitting one JSON object into multiple arrays based on character count. – Matt Jun 28 '19 at 12:15