
The following data stream does not trigger the 'end' event. The 'data' event is triggered and I can see every data row logged to the console.

```js
var AWS = require('aws-sdk');
var ogr2ogr = require('ogr2ogr');
var JSONStream = require('JSONStream');

var S3 = new AWS.S3();
// Note: the AWS SDK expects capitalized Bucket/Key parameter names.
var source = S3.getObject({Bucket: ..., Key: ...}).createReadStream();

var stream = ogr2ogr(source).format("GeoJSON").stream()
  .pipe(JSONStream.parse('features.*'));

stream.on('data', function (data) {
  console.log(data);               // Correctly outputs 70 rows of data.
});

stream.on('end', function () {
  console.log('end');              // This code is never executed.
});

stream.on('error', function (err) {
  console.log(err);                // No errors...
});
```

The process works if I write the ogr2ogr output to a file and then create a new read stream from that file before piping into JSONStream.
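A minimal sketch of that workaround, reusing the `source`, `ogr2ogr`, and `JSONStream` objects from above (the temp file path is just a placeholder):

```js
var fs = require('fs');

// Write the converted GeoJSON to disk first ...
var out = fs.createWriteStream('/tmp/features.geojson');    // placeholder path
ogr2ogr(source).format("GeoJSON").stream().pipe(out);

out.on('finish', function () {
  // ... then parse it from a fresh read stream.
  fs.createReadStream('/tmp/features.geojson')
    .pipe(JSONStream.parse('features.*'))
    .on('data', function (feature) { console.log(feature); })
    .on('end', function () { console.log('end'); });        // 'end' fires here
});
```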

1 Answer


Take a look at the docs: https://nodejs.org/api/stream.html#stream_event_end

> Note that the 'end' event will not fire unless the data is completely consumed. This can be done by switching into a flowing mode, or by calling stream.read() repeatedly until you get to the end.
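In other words, 'end' only fires once the stream has been drained. A minimal sketch of the two approaches the quoted docs describe, using a small illustrative readable stream rather than the question's ogr2ogr pipeline:

```js
var Readable = require('stream').Readable;

// A tiny readable stream used only for illustration.
function makeStream() {
  var rows = ['a', 'b', 'c'];
  return new Readable({
    read: function () {
      this.push(rows.shift() || null);   // push null to signal the end
    }
  });
}

// Option 1: flowing mode - attaching a 'data' listener (or calling resume())
// consumes the data, so 'end' can fire.
var flowing = makeStream();
flowing.on('data', function (chunk) { console.log('row:', chunk.toString()); });
flowing.on('end', function () { console.log('end (flowing)'); });

// Option 2: paused mode - call read() on every 'readable' event until it
// returns null; once everything has been read, 'end' fires.
var paused = makeStream();
paused.on('readable', function () {
  var chunk;
  while ((chunk = paused.read()) !== null) {
    console.log('row:', chunk.toString());
  }
});
paused.on('end', function () { console.log('end (paused)'); });
```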

  • Thanks! I'll give it a try and switch into flowing mode, but I still don't understand why in some cases the 'end' event fires without switching to flowing mode (for example, when I write the file to disk and start another readable stream before using JSONStream). – Mono Mar 04 '16 at 21:24
  • I know this answer is almost 3 years old, but anyway: The same [documentation page](https://nodejs.org/api/stream.html#stream_two_reading_modes) says that a stream is automatically switched to flowing mode when a `stream.on('data', ...)` event listener is added. – marcvangend Jan 04 '19 at 13:43
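Following up on that last comment: in newer Node.js versions (9.4+, so newer than the versions discussed above) you can inspect which reading mode a stream is in via `readable.readableFlowing`. A small sketch, not part of the original question's code:

```js
var PassThrough = require('stream').PassThrough;

var stream = new PassThrough();
console.log(stream.readableFlowing);       // null - no consumption mechanism yet

stream.on('data', function (chunk) {});    // adding a 'data' listener ...
console.log(stream.readableFlowing);       // true - stream is now in flowing mode

stream.pause();
console.log(stream.readableFlowing);       // false - explicitly paused
```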