
I'm working on a Node project that lets users upload .csv files whose contents may then be added to a MongoDB database. I have it working with a different module, but with large files it finishes reading and redirects the user before all the data has been processed. I think promised-csv will help with this, but I'm having trouble figuring out how to set it up. This is the current setup:

app.post('/import', isLoggedIn, function(req, res) {
    var reader = csv.createCsvFileReader(req.files.spreadsheet.path, { columnsFromHeader: true, separator: ',' });
    reader.addListener('data', function(data) {
        // do stuff with data here.
    });
    reader.addListener('end', function() {
        fs.unlink(req.files.spreadsheet.path, function(err) {
            res.redirect('/');
        });
    });
});
user3307654

1 Answer


First, you haven't mentioned what size range the CSV files are in. From your code I can see that the file is processed in the same request handler that receives the upload. Depending on the size, if a file is over 5 MB I'd suggest using some sort of batch worker to do the processing. Also add some logging so you can see how long the inserts take and whether the request times out before they finish.
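
As a rough sketch of that logging, and of holding the redirect back until every row has actually been written, you could collect one promise per insert using the reader you already have. The collection handle below is an assumption (a MongoDB driver collection), not something from your code:

app.post('/import', isLoggedIn, function(req, res) {
    var reader = csv.createCsvFileReader(req.files.spreadsheet.path, { columnsFromHeader: true, separator: ',' });
    var start = Date.now();
    var pending = [];   // one promise per row insert
    var rows = 0;

    reader.addListener('data', function(data) {
        rows++;
        // insertOne returns a promise when no callback is given
        pending.push(collection.insertOne(data));
    });

    reader.addListener('end', function() {
        // redirect only after every insert has settled, and log how long it took
        Promise.all(pending).then(function() {
            console.log('imported %d rows in %d ms', rows, Date.now() - start);
            fs.unlink(req.files.spreadsheet.path, function() {
                res.redirect('/');
            });
        }).catch(function(err) {
            console.error('import failed after %d ms', Date.now() - start, err);
            res.status(500).send('import failed');
        });
    });
});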

Take a look at this post about a good stream-based Node CSV library: NodeJs reading csv file.
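
For illustration, a stream-based read could look like the snippet below. csv-parser is just one such library and not necessarily the one the linked post recommends:

var fs = require('fs');
var csvParser = require('csv-parser');   // npm install csv-parser

fs.createReadStream('spreadsheet.csv')   // path is illustrative
    .pipe(csvParser())                   // emits one object per row, keyed by the header line
    .on('data', function(row) {
        // rows arrive one at a time, so the whole file never sits in memory at once
        console.log(row);
    })
    .on('end', function() {
        console.log('done reading');
    })
    .on('error', function(err) {
        console.error('parse error', err);
    });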

Risto Novik
  • I should clarify: the CSV files aren't very large in size, but they contain a lot of rows with only about 8 columns of information. I've been hitting the issue with test files of 2,000+ rows. – user3307654 Aug 09 '16 at 19:25
  • Anyway, if it's possible to do the processing row by row instead of on the whole file at once, that's better for the event loop because it doesn't get blocked. – Risto Novik Aug 09 '16 at 19:43
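
As a sketch of that row-by-row idea, you could pause the parser while each insert is in flight so the parser never races ahead of the database. The collection handle and the importFile name are made up for illustration, and csv-parser is again just one possible stream-based parser:

var fs = require('fs');
var csvParser = require('csv-parser');

function importFile(path, collection, done) {
    var stream = fs.createReadStream(path).pipe(csvParser());
    var inFlight = 0;
    var ended = false;

    function maybeDone() {
        // finish only once the stream has ended and no insert is still running
        if (ended && inFlight === 0) done();
    }

    stream.on('data', function(row) {
        stream.pause();      // stop parsing while this row is written
        inFlight++;
        collection.insertOne(row, function(err) {
            if (err) console.error('insert failed', err);
            inFlight--;
            stream.resume(); // pull the next row only after the write returns
            maybeDone();
        });
    });

    stream.on('end', function() {
        ended = true;
        maybeDone();         // 'end' can arrive while the last insert is still pending
    });
}

The upload handler would then call importFile with the uploaded file's path and only unlink the file and redirect inside the done callback.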