
I am reading a CSV using the csv-parser npm module and have to perform some operation on the data I get from the CSV (for each line).

const fs = require('fs');
const parser = require('csv-parser');

const readstream = fs.createReadStream('src/working_file.csv');
const stream = readstream.pipe(parser());
stream.on('data', async data => {
  // data is a JSON object of the row in the CSV.
  // Now I am calling another async function using the data in the JSON
  console.log('before calling');
  const writeToFile = await getImage(data.searchKey);
  console.log('after calling');
  // do other stuff
});

async function getImage(searchKey) {
  // I'm doing web scraping here using puppeteer
  // it has some await calls too
  console.log('in getimage');
  const results = await scrapper.run().catch(err => {
    console.error(err);
    process.exit(1);
  });
}

Say my CSV has 2 rows; then my output comes out like below:

before calling
in getimage
before calling
in getimage
after calling
after calling

But when I run this, all the calls happen at once even though I used await. If I have 10 rows in the CSV, the function is called for all 10 rows at the same time, but I want it to happen one by one: only when the operation on the first row completes should the second row be processed.

My problem is that all the calls happen at once rather than one by one.

m9m9m
  • Can you show the code dealing with ten rows? –  Nov 14 '18 at 10:41
  • Node.js works in the same way as other js platforms. See https://stackoverflow.com/questions/14220321/how-do-i-return-the-response-from-an-asynchronous-call – David Lemon Nov 14 '18 at 10:44
  • Possible duplicate of [How do I return the response from an asynchronous call?](https://stackoverflow.com/questions/14220321/how-do-i-return-the-response-from-an-asynchronous-call) – David Lemon Nov 14 '18 at 10:44
  • David, I have explained my problem more clearly – m9m9m Nov 14 '18 at 11:28
  • you would need to provide the `getImage` code maybe there is something in there, maybe some `await` is needed – Nikos M. Nov 14 '18 at 12:00
  • yes, there is an "await" there. Thanks for being so specific. – m9m9m Nov 14 '18 at 12:27
  • It looks like the stream's `data` event is fired for each CSV line in close succession; nothing in your code is preventing the 2nd data event from firing before the 1st has finished. –  Nov 14 '18 at 12:36
  • I am calling the function asynchronously. Doesn't it make sure that lines go one by one into it? – m9m9m Nov 15 '18 at 04:44
  • Isn't the issue here that your callback is async but the `stream.on` is not aware of that and is not awaiting your callback making your callback effectively fire-and-forget? – Pawel Nov 21 '18 at 06:30

1 Answer


Try this code.

var fs = require('fs');
var parse = require('csv-parse');
var async = require('async');

var inputFile = 'src/working_file.csv';

var parser = parse({delimiter: ','}, function (err, data) {
  if (err) throw err;
  async.eachSeries(data, function (line, callback) {
    // do something with the line
    doSomething(line).then(function () {
      // when processing finishes, invoke the callback to move to the next line
      callback();
    }, callback); // forward errors to eachSeries as well
  });
});

fs.createReadStream(inputFile).pipe(parser);

You can also use fast-csv.

Sayed Mohd Ali