
I have the following code I am trying to troubleshoot, but for the life of me I cannot figure out why it's acting weird. Below is a function called runTest which takes a JSON array and a response stream provided by router.get in Express.

My JSON array could contain hundreds of thousands of objects, and I would like to write them back in the response as a file. The behavior I want is that Chrome starts downloading the file while the response continuously streams data into it.

function runTest(results, res) {

    function writeToStream(i) {
        for (; i < results.length; i++) {
            if (!res.write(
                    results[i]['person']['name'] + "," +
                    results[i].person.name + "," +
                    results[i].person.age +
                    "/n")) {

                // Wait for it to drain then start writing data from where we left off
                res.once('drain', function() {
                    writeToStream(i + 1);
                });
                return;
            }
            //res.pipe(res)
        }
        res.end();
    }
    writeToStream(0)
}

Note: I am trying to implement the first answer by Mike C. in this discussion: "Node: fs write() doesn't write inside loop. Why not?"

summerNight
  • Could you elaborate on 'acting weird'? What exactly is happening? An error perhaps? Or an incorrectly formatted response? Please add that information to the question. On a side note, it seems a bit odd that you're trying to 'stream' the response when you've already got the object `results` in memory. Surely if it's too big to send in one go it's also too big to hold in memory? – skirtle Sep 21 '17 at 01:28
  • Are you trying to convert `results` to a CSV? Shouldn't that be `\n` rather than `/n`? – skirtle Sep 21 '17 at 01:40

0 Answers