
I have the following object that I want to split into chunks of 100 records, saving each chunk to a separate file using Node.js. However, I have not been able to find a way to split an array with thousands of records into groups of 100 and save each group to its own file.

Question:

How can I split an object's array property into chunks of 100 and save each chunk to a separate file?

What I tried

var fs = require('fs');
var filePath = 'myPath.json';

fs.readFile(filePath, (err, data) => {
  if(err) {
    throw err;
  }
  let jsonObject = JSON.parse(data);
  var numOfRecords = jsonObject.myRecords.length;
  var numOfFilesRequired = Math.ceil(numOfRecords/100);

  for(var num = 0; num < numOfFilesRequired; num++) {
    var splitItems = jsonObject.slice(0, numOfRecords);
    var newJsonObject = [];
    newJsonObject.push({ myRecords : splitItems });
    // this adds an additional [ and ] at the beginning and end, which I don't want
    fs.writeFileSync('output.json', JSON.stringify(newJsonObject));
  }
});

Sample data

{ myRecords : [
  { /* record 1*/},
  { /* record 2*/},
  ...
  { /* record 1000*/}
  ]
}

Expected output

  • 10 files, each storing 100 records
  • myfile-1.json contains records 1-100
  • myfile-2.json contains records 101-200
  • myfile-3.json contains records 201-300

Sample of the expected output file, myfile-1.json:

{
 myRecords : [
    { /* record 1*/},
    { /* record 2*/},
    ...
    { /* record 100*/}
 ]
}
usernameabc
  • I think you need to parse the JSON into an object, then use Object.entries to turn the object into an array you can loop over. Slice off 100 records at a time (or remove them from the array as you write each file), and use a recursive function to repeat until the array is empty. – Hamza Miloud Amar May 12 '21 at 21:56
  • I think that solution is not optimized, though: if the file has a lot of lines it can blow up your RAM, so you need to work with the stream API in Node. – Hamza Miloud Amar May 12 '21 at 21:57
  • If the file is huge, then I'd try the library: https://www.npmjs.com/package/stream-json If it's not going to have more than 1,000 entries, then it's OK to read in the whole file and do the processing after. – Szabolcs Dézsi May 12 '21 at 22:03
  • You appear to be confused between [JSON and object literal notation](https://stackoverflow.com/questions/2904131/what-is-the-difference-between-json-and-object-literal-notation). What you are showing as expected output is not valid JSON, since keys in JSON must be quoted with double quote characters (`"`). – Heretic Monkey May 12 '21 at 22:21
  • @HereticMonkey I updated the code sample to use `.splice()` but now have the issue of `[` and `]` at the beginning and end of the `jsonObject` which I don't want – usernameabc May 12 '21 at 22:43
  • You're using `slice`, not `splice`... `numOfRecords` is the total number of records, so your first `slice` gets all of the records. You need to keep track of where you are in the array so that you only pull 100 records at a time. Also, your `newObject` should just be `newObject = { myRecords: splitItems }`, not an array. – Heretic Monkey May 12 '21 at 22:46
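
A minimal sketch of the fix Heretic Monkey describes, assuming the input shape from the question and the myfile-N.json naming from the expected output:

var fs = require('fs');

fs.readFile('myPath.json', (err, data) => {
  if (err) throw err;
  // Sketch only: take 100 records per pass and wrap each chunk in a plain
  // object (not an array), which avoids the stray [ and ].
  const records = JSON.parse(data).myRecords;
  for (let num = 0; num * 100 < records.length; num++) {
    // slice(start, end) copies records [start, end) without mutating the array.
    const splitItems = records.slice(num * 100, (num + 1) * 100);
    fs.writeFileSync(`myfile-${num + 1}.json`, JSON.stringify({ myRecords: splitItems }));
  }
});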

1 Answer


Some kind of nested loop should do it for you.

for (let i = 0; i < jsonObject.myRecords.length; i += 100) {
  // Start a new chunk here. If you want a number for the file it will be i / 100.
  const chunk = [];
  for (let j = i; j < i + 100 && j < jsonObject.myRecords.length; j++) {
    // Add the record to the chunk.
    chunk.push(jsonObject.myRecords[j]);
  }
  // Save the file, wrapping the chunk in a plain object to match the expected output.
  fs.writeFileSync(`myfile-${i / 100 + 1}.json`, JSON.stringify({ myRecords: chunk }));
}

The steps will be a bit different if you are using streams instead of fs.writeFile-style calls, but you'll probably still end up with two nested loops.
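
For very large input files, the stream-json package mentioned in the comments can process records without loading the whole file into memory. A rough sketch, assuming the question's input shape and a chunk size of 100 (the Pick filter and StreamArray streamer are part of stream-json; stream-chain is installed along with it):

var fs = require('fs');
var { chain } = require('stream-chain');
var { parser } = require('stream-json');
var { pick } = require('stream-json/filters/Pick');
var { streamArray } = require('stream-json/streamers/StreamArray');

// Parse the file incrementally and emit one myRecords element at a time.
const pipeline = chain([
  fs.createReadStream('myPath.json'),
  parser(),
  pick({ filter: 'myRecords' }),
  streamArray(),
]);

let chunk = [];
let fileNum = 1;

pipeline.on('data', ({ value }) => {
  chunk.push(value);
  if (chunk.length === 100) {
    fs.writeFileSync(`myfile-${fileNum++}.json`, JSON.stringify({ myRecords: chunk }));
    chunk = [];
  }
});

pipeline.on('end', () => {
  // Flush any remainder smaller than 100.
  if (chunk.length > 0) {
    fs.writeFileSync(`myfile-${fileNum}.json`, JSON.stringify({ myRecords: chunk }));
  }
});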

Charlie Bamford