I've written a very simple script to turn a CSV file in a specific format into JSON, and now I'm trying to move it into a module that I can use from other scripts. I'm using readline, and unfortunately, instead of returning the complete JSON object, the asynchronous processing finishes after everything else, so I get this output:

Process stream
undefined
Stream closed

Instead of the desired:

Process stream
Stream closed
[all the converted JSON]

The entire module file is:

var readline = require('readline');

var transform = function(input) {
  console.log('Process stream');
  var json = transform.process(input);
  console.log(json);
  return json;
};

transform.process = function(input) {
  var rl = readline.createInterface({
    input: input,
    terminal: false
  });

  var log = {};
  log.channels = [];
  log.data = [];

  rl
    .on('line', function(line) {
      /* irrelevant code to transform content goes here */
    })
    .on('close', function() {
      console.log('Stream closed');
      return log;
    });
};

module.exports = transform;

A trivial example:

var fmt2json = require('./fmt2json');
var json = fmt2json(process.stdin);
console.log(json);

And execution is done as:

node test.js < datafile.csv

I've read about a variety of approaches using async.parallel, wait.for and Promise, but I'd like to keep things simple if at all possible. I assume there's some un-JavaScript-like way to force the functions to execute synchronously, but I'd also like to know how to do this properly.

  • No, you absolutely cannot force asynchronous functions to execute synchronously. If you want to do it properly, use promises. – Bergi Mar 09 '16 at 04:25
  • Thanks, once I knew that I should just focus on promises, I managed to figure it out. – Geoff Johnson Mar 09 '16 at 23:47
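
For reference, a minimal promise-based sketch along the lines Bergi suggests (keeping the line-handling logic as a placeholder, with only rough error handling) might look something like this:

var readline = require('readline');

var transform = function(input) {
  console.log('Process stream');
  // transform.process now returns a Promise rather than a plain value.
  return transform.process(input);
};

transform.process = function(input) {
  return new Promise(function(resolve, reject) {
    var rl = readline.createInterface({
      input: input,
      terminal: false
    });

    var log = {};
    log.channels = [];
    log.data = [];

    rl
      .on('line', function(line) {
        /* irrelevant code to transform content goes here */
      })
      .on('close', function() {
        console.log('Stream closed');
        // Resolving here is what hands the finished object back to the caller.
        resolve(log);
      });

    input.on('error', reject);
  });
};

module.exports = transform;

The calling script then waits for the promise instead of expecting a return value:

var fmt2json = require('./fmt2json');

fmt2json(process.stdin).then(function(json) {
  console.log(json);
});

With this shape, 'Stream closed' still prints before the JSON, but the converted object is delivered once the stream has actually finished.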

0 Answers