I am parsing extremely large CSV files (~37 GB) using fs.createReadStream and csv-parser. I break the rows into batches of 5,000 and insert each batch into MongoDB. The error below happens even when the Mongo portion is commented out, however.

Here's the function to parse the file:

import * as fs from 'fs';
import csvParser from 'csv-parser';
import { Db } from 'mongodb';

function parseCsv(fileName: string, db: Db): Promise<any[]> {
    let parsedData: any[] = [];
    let counter = 0;
    return new Promise((resolve, reject) => {
        const stream = fs.createReadStream(fileName)
            .pipe(csvParser())
            .on('data', async (row) => {
                const data = parseData(row);
                parsedData.push(data);

                if (parsedData.length > 5000) {
                    stream.pause();
                    // insert to mongo
                    counter++;
                    console.log('counter - ', counter, parsedData[0].personfirstname, parsedData[23].personfirstname);
                    parsedData = [];

                    // try {
                    //  await db.collection('people').insertMany(parsedData, { ordered: false });
                    //  parsedData = [];
                    // }
                    // catch (e) {
                    //  console.log('error happened', e, parsedData.length);
                    //  process.exit();
                    // }

                    stream.resume();
                }
            })
            .on('error', (error) => {
                console.error('There was an error reading the csv file', error);
            })
            .on('end', () => {
                console.log('CSV file successfully processed');
                resolve(parsedData);
            });
    });
}

And here's the function to parse the data. The source data is kind of messy — all the values come in one cell separated by pipes — so I just split on them:

function parseData(data: any) {
    let values = '';
    for (const key in data) {
        if (data.hasOwnProperty(key)) {
            values += data[key];
        }
    }
    const splitValues = values.split('|');
    // Deep copy the template so each row gets its own object
    const parsedData: any = JSON.parse(JSON.stringify(template));
    let keyCounter = 0;
    for (let key in parsedData) {
        if (parsedData.hasOwnProperty(key)) {
            try {
                parsedData[key] = splitValues[keyCounter].trim();
            }
            catch (e) {
                console.log('error probably trimming', key, splitValues[keyCounter], splitValues, data);
                throw '';
            }
            keyCounter++;
        }
    }

    const now = new Date();
    parsedData.createdAt = now;
    parsedData.updatedAt = now;
    return parsedData;
}

It parses fine until around 2 million rows and then hangs. After leaving it hanging all night, I checked in the morning and saw the following error:

buffer.js:580
      if (encoding === 'utf-8') return buf.utf8Slice(start, end);
                                           ^

Error: Cannot create a string longer than 0x3fffffe7 characters
    at stringSlice (buffer.js:580:44)
    at Buffer.toString (buffer.js:643:10)
    at CsvParser.parseValue (C:\js_scripts\csv-worker\node_modules\csv-parser\index.js:175:19)
    at CsvParser.parseCell (C:\js_scripts\csv-worker\node_modules\csv-parser\index.js:86:17)
    at CsvParser.parseLine (C:\js_scripts\csv-worker\node_modules\csv-parser\index.js:142:24)
    at CsvParser._flush (C:\js_scripts\csv-worker\node_modules\csv-parser\index.js:196:10)
    at CsvParser.prefinish (_stream_transform.js:140:10)
    at CsvParser.emit (events.js:200:13)
    at prefinish (_stream_writable.js:633:14)
    at finishMaybe (_stream_writable.js:641:5) {
  code: 'ERR_STRING_TOO_LONG'
}

Shouldn't createReadStream ensure this doesn't happen? There are 415 columns in each row. Is it possible that a single row is too big? It always stops at the same place, so that seems likely, but the files are so big I have no way to open them and check. If so, how can I detect this and just skip that line, or handle it a different way?
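For what it's worth, here is the kind of pre-scan I had in mind for finding an oversized row — completely untested, and the function name, file name, and 10 MB threshold are just placeholders. It counts bytes between raw newlines so it never builds the line as a string (though it would miscount if quoted fields can contain line breaks):

import * as fs from 'fs';

function findOversizedLines(fileName: string, maxBytes: number): Promise<number[]> {
    return new Promise((resolve, reject) => {
        const oversized: number[] = [];
        let lineNumber = 1;
        let currentLineBytes = 0;

        fs.createReadStream(fileName)
            .on('data', (chunk: Buffer) => {
                for (let i = 0; i < chunk.length; i++) {
                    if (chunk[i] === 0x0a) { // '\n' ends the current line
                        if (currentLineBytes > maxBytes) {
                            oversized.push(lineNumber);
                        }
                        lineNumber++;
                        currentLineBytes = 0;
                    } else {
                        currentLineBytes++;
                    }
                }
            })
            .on('error', reject)
            .on('end', () => {
                // Catch a final line with no trailing newline
                if (currentLineBytes > maxBytes) {
                    oversized.push(lineNumber);
                }
                resolve(oversized);
            });
    });
}

// e.g. findOversizedLines('people.csv', 10 * 1024 * 1024).then(lines => console.log(lines));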
