
I wrote the following code:

var csv = require('csv-parser')
var fs = require('fs')

var devices = []
fs.createReadStream('devices.csv')
    .pipe(csv())
    .on('data', function (data) {
        devices.push(data)
    });

console.log(devices)

The line devices.push(data) inserts each line of the CSV file into the global array 'devices'. Unfortunately, when I reach the last line of my code (outside the callback), I see that devices is still an empty array. Why does this happen, and how can I make it work as I want?

Colin Rauch
CrazySynthax
  • You have it backwards. You reach the last line in the code first, when the array is still empty, and then afterwards when the callback is executed it will get filled with the data. – Bergi Nov 17 '16 at 01:04

1 Answer


The stream handling is asynchronous (you can think of it like it runs in the background with event notifications and the rest of your code continues to have an opportunity to run). When you do .pipe() and .on('data'), you are just starting the stream operation and then setting an event handler that will be called some indeterminate time in the future. The rest of your code continues to run.

Thus your console.log(devices) line of code runs BEFORE the stream parsing. You need to register for an event that signals the end of the stream parsing and then look at the devices array in that event handler.

var csv = require('csv-parser')
var fs = require('fs')

var devices = []
fs.createReadStream('devices.csv')
    .pipe(csv())
    .on('data', function (data) {
        devices.push(data)
    }).on('end', function() {
        // use the devices array in here
        console.log(devices);
    });

Node.js stream events are documented here.

jfriend00