
I get the following error. From what I can tell, the JSON file is too large to load because the server does not have enough memory.

How can I solve this problem? I have read about Node streams, but I find them hard to understand. Can you explain them in simpler terms?

server.js

const express = require('express');
const cors = require('cors');
const userJson = require('./user');
const locationJson = require('./location');

const API_PORT = process.env.PORT || 3002;
const app = express();
app.use(cors());
const router = express.Router();

router.get("/getUserData", (req, res) => {
    return res.json(userJson);
});

router.get("/getLocationData", (req, res) => {
    return res.json(locationJson);
});

app.use("/api", router);

app.listen(API_PORT, () => console.log(`LISTENING ON PORT ${API_PORT}`));

error message

buffer.js:585
      if (encoding === 'utf8') return buf.utf8Slice(start, end);
                                          ^

Error: Cannot create a string longer than 0x3fffffe7 characters
    at stringSlice (buffer.js:585:43)
    at Buffer.toString (buffer.js:655:10)
    at Object.readFileSync (fs.js:392:41)
    at Object.Module._extensions..json (internal/modules/cjs/loader.js:816:22)
    at Module.load (internal/modules/cjs/loader.js:666:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:606:12)
    at Function.Module._load (internal/modules/cjs/loader.js:598:3)
    at Module.require (internal/modules/cjs/loader.js:705:19)
    at require (internal/modules/cjs/helpers.js:14:16)
    at Object.<anonymous> (/Users/k/Desktop/dev/backend/location.js:1:22)
J184937
  • Streams seem to be your only option. What do you not understand? The documentation has plenty of examples. Basically, you create a stream, then pipe it to whatever processing function you want. This function will receive the data in chunks. – Seblor Jun 17 '19 at 10:01
  • Does using a stream mean the same thing as splitting the data? For example, is it more efficient than manually allocating 10 units of memory? Is using streams faster than allocating additional memory? – J184937 Jun 17 '19 at 10:07
  • I don't know exactly how memory management works in Node, but you could take a look at this question: https://stackoverflow.com/questions/42896447/parse-large-json-file-in-nodejs-and-handle-each-object-independently – Seblor Jun 17 '19 at 11:54
  • `require('./user')` and `require('./location')` are your problem lines (assuming those really are too large). Rather than just `require` them in, you'll need to create a [**Readable Stream**](https://nodejs.org/api/stream.html#stream_readable_streams), and only after the data has been loaded, instantiate your express routes. _EDIT_ Also looks like there is an npm module [JSONStream](https://www.npmjs.com/package/JSONStream) that might be helpful here (see the sketch after these comments). – romellem Jun 17 '19 at 14:43
  • Seblor & romellem, thank you so much :) – J184937 Jun 17 '19 at 23:45
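For reference, a minimal sketch of the JSONStream approach romellem mentions, assuming user.json contains one large top-level array (the file name and shape are assumptions, not taken from the question):

const fs = require('fs');
const JSONStream = require('JSONStream');

// Assumption: user.json holds one big top-level array.
fs.createReadStream('./user.json')
    .pipe(JSONStream.parse('*')) // emits each array element separately
    .on('data', (user) => {
        // Only one element is in memory at a time, never the whole file.
        console.log(user);
    })
    .on('error', (err) => console.error(err));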

1 Answer


You can stream the file and pipe() it directly into res:

const fs = require('fs');
const path = require('path');

const getBigJson = (req, res, next) => {
    const pathToJson = path.join(__dirname, 'path/to/file');
    // Read the file as a stream instead of loading it into memory at once
    const jsonStream = fs.createReadStream(pathToJson);
    res.set({'Content-Type': 'application/json'});
    // Forward read errors to Express instead of crashing the process
    jsonStream.on('error', next);
    jsonStream.pipe(res);
};

It should send it chunk by chunk without exhausting your memory.
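Wired into the question's router, that might look as follows (a sketch; it assumes pathToJson above is changed to point at the real user.json, a name taken from the question's require('./user')):

// Reuse the streaming handler for the existing route.
router.get("/getUserData", getBigJson);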

See https://github.com/substack/stream-handbook for a good explanation of streams.

OLIVIER