
I am new to JavaScript and the npm world. I'm trying to upload some data to my REST service via a POST call. I fetch the data from a CSV file; so far, so good. For each fetched line I convert the data (for my needs) and call the REST API to upload it. Since I have many lines (approx. 700), the API gets called quite often in quick succession. After some calls (around 500 or so) I get a socket error:

events.js:136
      throw er; // Unhandled 'error' event
      ^

Error: connect ECONNRESET 127.0.0.1:3000
    at Object._errnoException (util.js:999:13)
    at _exceptionWithHostPort (util.js:1020:20)
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1207:14)

I guess this is because I call the REST API too often. What I don't understand is: how should I make the calls synchronously in order to avoid so many connections? Or shouldn't I? What would be the proper solution in JS for this?

I have tried Promises and so on, but none of that helped; it only moved the issue a few function calls earlier...

This is my code:

readCsv()

function readCsv() {
    var csvFile = csvFiles.pop()
    if (csvFile) {
        csv({ delimiter: ";" }).fromFile(csvFile).on('json', async (csvRow) => {
            if (/.*\(NX\)|.*\(NI\)|.*\(NA\)|.*\(WE\)|.*\(RA\)|.*\(MX\)/.test(csvRow["Produkt"])) {
                var data = await addCallLog(
                    csvRow["Datum"],
                    csvRow["Zeit"],
                    csvRow["Menge-Zeit"],
                    csvRow["Zielrufnummer"],
                    csvRow["Produkt"]);
            }
        }).on('done', (error) => {
            //console.log('end')
            readCsv()
        })
    } else {

    }
}

function addCallLog(date, time, duration, number, product) {
    return new Promise(resolve => {
        args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" }
        client.methods.addCallLog(args, (data, response) => {
            // client.methods.getCallLog((data, response) => {
            //     console.log(data)
            // })
            //console.log("addCallLog resolve")
            resolve(data)
        })
    })
}

As you can see, I had the same issue with reading more than one CSV file in parallel. I solved that by calling the readCsv function recursively, popping the next file only after the previous one was done.

burix
  • Possible duplicate of [How to make 'http requests' synchronous in Node js](https://stackoverflow.com/questions/37085510/how-to-make-http-requests-synchronous-in-node-js) – msorce Feb 13 '18 at 00:58
  • A problem here is that `await addCallLog()` won't keep the next `json` events from being generated so you will end with a zillion requests in flight at the same time and apparently you have so many that you run out of resources. – jfriend00 Feb 13 '18 at 02:04
  • One possibility here is to collect all the json data into an array and then use a `for` loop to iterate the array where your `await addCallLog()` will work to serialize the requests. – jfriend00 Feb 13 '18 at 02:08
  • What csv module are you using? – jfriend00 Feb 13 '18 at 05:05
  • I was using csvtojson – burix Feb 13 '18 at 22:55

2 Answers


You can't call things synchronously. But you can sequence the async REST calls, which is what I presume you mean.

A problem here is that await addCallLog() won't keep the next json events from being generated, so you will end up with a zillion requests in flight at the same time; apparently you have so many that you run out of resources.

One way around that is to collect the rows you want into an array and then use a regular for loop to iterate that array; await works as expected inside the for loop. Here's what that would look like:

readCsv()

function readCsv() {
    var csvFile = csvFiles.pop()
    if (csvFile) {
        let rows = [];
        csv({ delimiter: ";" }).fromFile(csvFile).on('json', (csvRow) => {
            if (/.*\(NX\)|.*\(NI\)|.*\(NA\)|.*\(WE\)|.*\(RA\)|.*\(MX\)/.test(csvRow["Produkt"])) {
                rows.push(csvRow);
            }
        }).on('done', async (error) => {
            for (let csvRow of rows) {
                var data = await addCallLog(
                    csvRow["Datum"],
                    csvRow["Zeit"],
                    csvRow["Menge-Zeit"],
                    csvRow["Zielrufnummer"],
                    csvRow["Produkt"]
                );
            }
            readCsv();
        })
    } else {

    }
}

function addCallLog(date, time, duration, number, product) {
    return new Promise(resolve => {
        args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" }
        client.methods.addCallLog(args, (data, response) => {
            // client.methods.getCallLog((data, response) => {
            //     console.log(data)
            // })
            //console.log("addCallLog resolve")
            resolve(data)
        })
    })
}

Your code appears to be missing error handling. The client.methods.addCallLog() call needs a way to communicate an error back.

You probably also need an error event handler for the csv iterator.

jfriend00
  • Why the downvote? Can't improve my answer if you don't at least comment. – jfriend00 Feb 13 '18 at 20:47
  • That solved my problem! Many thanks. Using the buffer was the relevant hint for me. I have also changed my code to use the "then" callback of the promise instead of await. Now it's working like a charm. Btw., I actually didn't vote down. At least I finally voted up :) – burix Feb 13 '18 at 22:52
  • @burix - Glad your problem is fixed, but I don't know what you mean by "switched to `.then()`", because it's the `await` inside the `for` loop that makes things run one at a time to prevent running too many requests at once. – jfriend00 Feb 13 '18 at 23:18
  • I used the then callback of Promises instead of async/await; it is now like a recursive call. But the most important part was to introduce a buffer that I first filled with data and only uploaded afterwards. I posted my final solution in an answer below... – burix Feb 14 '18 at 22:05

After filling the buffer in a prev. function, I check that buffer for data and upload the entries one by one using the "then" callback of the promise:

var callLogBuffer = []

checkForUpload()

function checkForUpload() {
    console.log("checkForUpload")
    if (callLogBuffer.length > 0) {
        addCallLog(callLogBuffer.pop()).then((data) => {
            checkForUpload()
        })
    }
}


function addCallLog(callLog) {
    return new Promise(resolve => {
        args.data = { number: callLog.number, name: "", timestamp: getTimestamp(callLog.date, callLog.time), duration: getDuration(callLog.duration), type: "OUTGOING" }
        client.methods.addCallLog(args, (data, response) => {
            // client.methods.getCallLog((data, response) => {
            //     console.log(data)
            // })
            //console.log("addCallLog resolve")
            resolve(data)
        })
    })
}
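The filling step is not shown above. A hedged sketch of what it might look like (the fillBuffer name and the object shape are assumptions; the CSV field names are taken from the question):

```javascript
// Hypothetical filler: convert the filtered CSV rows into plain call-log
// objects and queue them in the given buffer for the upload chain above.
function fillBuffer(rows, callLogBuffer) {
    for (const csvRow of rows) {
        callLogBuffer.push({
            date: csvRow["Datum"],
            time: csvRow["Zeit"],
            duration: csvRow["Menge-Zeit"],
            number: csvRow["Zielrufnummer"]
        });
    }
}

// usage: fillBuffer(rows, callLogBuffer); then checkForUpload();
```

Note that checkForUpload() takes entries with pop(), so uploads run in reverse order; using shift() instead would preserve the order of the CSV file.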
burix