
I am making calls to the Udemy API. To make simultaneous calls, I am using a loop that automatically increments the page number; I try to fetch the data from each page and store it in an array so that I can write all of the data into a single file in JSON format. But all I am getting is an empty array. How do I access the value returned by each promise and store it in the `doc.table` array?

My code:

const fetch=require("node-fetch");
const fs=require("fs");
let doc={};
doc.table=[];

for(let i=1;i<=10;i++){

fetch('https://www.udemy.com/api-2.0/courses/ page='+i+'&page_size=10&client_id=${client_id}&client_secret=${client_secret},{
      method:'GET',
      body:null,
      headers:{authorization: ${auth_code}}
      })
      .then(res=>res.json())
      .then(json=>doc.table.push(json))
};


fs.writeFile("UDEMY.json",JSON.stringify(doc),function(err){
    if(err) throw err;
    console.log("Complete");
});
Nishanth B.S
  • since promises are asynchronous, you'll need to wait for them to resolve before using any result - the simplest way is to use async/await, and await the `fetch` - of course, the code will need to be in an `async function` ... or simply inside `(async () => { ... all the code from for loop to all the writefile code ...})();` – Jaromanda X May 10 '20 at 07:01
  • two things of note in the code you posted ... you seem to want to use template string, but don't do it right, and no need for `;` in the close `}` of a for loop – Jaromanda X May 10 '20 at 07:06
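
A minimal sketch of the async IIFE wrapper suggested in the first comment above, assuming the same placeholder `client_id`, `client_secret`, and `auth_code` variables from the question:

const fetch = require("node-fetch");
const fs = require("fs");

(async () => {
    const doc = { table: [] };

    for (let i = 1; i <= 10; i++) {
        // await pauses the loop until this page's response has arrived
        const res = await fetch(`https://www.udemy.com/api-2.0/courses/?page=${i}&page_size=10&client_id=${client_id}&client_secret=${client_secret}`, {
            headers: { authorization: auth_code }
        });
        doc.table.push(await res.json());
    }

    // runs only after all ten pages have been collected
    fs.writeFile("UDEMY.json", JSON.stringify(doc), function (err) {
        if (err) throw err;
        console.log("Complete");
    });
})();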

3 Answers


I'd suggest using await so that your for loop will be paused for each iteration:

const fetch = require("node-fetch");
const fsp = require("fs").promises;

let doc = { table: []};

async function run() {
    for (let i = 1; i <= 10; i++) {

        let data = await fetch(`https://www.udemy.com/api-2.0/courses?page=${i}&page_size=10&client_id=${client_id}&client_secret=${client_secret}`,{
              method:'GET',
              body:null,
              headers:{authorization: auth_code}
        }).then(res=>res.json());

        doc.table.push(data);
    }

    await fsp.writeFile("UDEMY.json",JSON.stringify(doc));
    console.log("done");
}

run().catch(err => {
    console.log(err);
});

The other possibility is to run all the requests in parallel and use `Promise.all()` to know when they are all done. The key to both solutions is to use the promise that `fetch()` returns to control knowing when things are done.

If you really want to run them in parallel and you're sure that your target host will allow it, you can do this:

const fetch = require("node-fetch");
const fsp = require("fs").promises;

let doc = { table: []};

function run() {
    let promises = [];
    for (let i = 1; i <= 10; i++) {

        promises.push(fetch(`https://www.udemy.com/api-2.0/courses?page=${i}&page_size=10&client_id=${client_id}&client_secret=${client_secret}`,{
              method:'GET',
              body:null,
              headers:{authorization: auth_code}
        }).then(res=>res.json()));

    }
    return Promise.all(promises).then(data => {
        doc.table = data;
        return fsp.writeFile("UDEMY.json",JSON.stringify(doc));
    });

}

run().then(() => {
    console.log('done');
}).catch(err => {
    console.log(err);
});

And, if you want some level of parallel requests, but want to limit how many are in parallel, you can use mapConcurrent() described here.
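
For reference, a concurrency-limited mapper along those lines might look roughly like this (an illustrative sketch, not the exact `mapConcurrent()` from the linked post):

// Run fn over items with at most `limit` calls in flight at once.
// Resolves with the results in the original order of `items`.
function mapConcurrent(items, limit, fn) {
    return new Promise((resolve, reject) => {
        if (items.length === 0) {
            resolve([]);
            return;
        }
        const results = new Array(items.length);
        let nextIndex = 0;   // next item to start
        let inFlight = 0;    // how many calls are currently running
        let finished = 0;    // how many calls have completed

        function launchNext() {
            while (inFlight < limit && nextIndex < items.length) {
                const i = nextIndex++;
                inFlight++;
                Promise.resolve(fn(items[i], i)).then(result => {
                    results[i] = result;
                    inFlight--;
                    finished++;
                    if (finished === items.length) {
                        resolve(results);
                    } else {
                        launchNext();
                    }
                }, reject);
            }
        }

        launchNext();
    });
}

Called as, say, `mapConcurrent([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], 3, fetchPage)` (where `fetchPage` is a hypothetical function wrapping the fetch above), it would keep at most three page requests in flight at a time.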

jfriend00
  • some APIs would not like too many parallel requests anyway, so it's probably better to do them in series (I think the OP thought they were running in series anyway) – Jaromanda X May 10 '20 at 07:20
  • @NishanthB.S - As I said, you can run your requests in parallel if you want using `Promise.all()` and have a much cleaner design. Some hosts will not let you start up 100 parallel requests (you'll get rate limited, or crash the host or cause the host to run really, really inefficiently). – jfriend00 May 10 '20 at 07:51
  • @NishanthB.S - I added a parallel option using promises to manage the control flow. – jfriend00 May 10 '20 at 07:54
  • @NishanthB.S - Also notice that both these options offer you clean error handling in one central place. That's another advantage of using good design patterns with promises. – jfriend00 May 10 '20 at 08:09
  • @jfriend00 Yeah, got it! In your earlier code using async/await, although there was a delay in execution time, no data was lost while writing the file. I think this is more important. Thanks for your response! – Nishanth B.S May 10 '20 at 08:12

You can try to track the current loop index and write your file within the last promise's fulfillment:

const fetch = require('node-fetch');
const fs = require('fs');

let url;
let counter = 10;
const doc = {
  table: []
};

for (let i = 1; i <= 10; i++) {
  url = `https://www.udemy.com/api-2.0/courses/?page=${i}&page_size=10&client_id=${client_id}&client_secret=${client_secret}`;
  fetch(url, {
    method: 'GET',
    body: null,
    headers: {
      authorization: auth_code
    }
  })
  .then(res => res.json())
  .then(json => {
    // next line would not guarantee the order of pages:
    // doc.table.push(json);
    // so we use the current loop index instead (i - 1, since i starts at 1)
    doc.table[i - 1] = json;
    // when counter is 0 we can write a file 
    if (!--counter) {
      fs.writeFile('UDEMY.json', JSON.stringify(doc), function(err) {
        if (err) {
          throw err;
        }
        console.log("Complete");
      });
    }
  })
}

I also fixed minor errors in your URL's template string...

Andriy
  • of course, this won't guarantee the order of data in `doc.table` nor will it guarantee that all the data is present, since when i===9, it may take so long that the result of `i===10` comes back before the result of `i===9` - it would be better to do `doc.table[i-1] = json` and also have a counter of results (can't use array length here either) and continue once 10 results have been received – Jaromanda X May 10 '20 at 07:07
  • in the OP's code snippet the order was not preserved anyway; if order is important we can use the loop index. I will update my answer, thanks for pointing that out... – Andriy May 10 '20 at 07:10
  • I think the OP doesn't understand promise chains hence he thought it was all done after the for loop, therefore thought each loop would be done in series, rather than parallel :p – Jaromanda X May 10 '20 at 07:15
  • @jfriend00 Yeah. Will keep that in mind. Thanks for the advice! – Nishanth B.S May 10 '20 at 07:59
  • jfriend00, thanks for your deleted feedback (from 2 hours ago) about antipatterns in my answer. There is nothing wrong in keeping asynchronous `fetch` calls asynchronous, and executing the needed callback asynchronously only when the condition is satisfied. This way rendering is not blocked. If we convert 10 asynchronous `fetch` calls to synchronous, browser rendering and other scheduled JS operations will be blocked until all 10 calls succeed or fail. Also `fs.writeFile()` is asynchronous while `await fsp.writeFile()` is synchronous, thus again blocking code execution and rendering – Andriy May 10 '20 at 10:17

You can also save the promises in an array and access them once all of them are fulfilled, if the order is important:

const promises = []
promises.push(...)
Promise.all(promises).then(data => ....)

`data` will now be an array containing the results of the individual promises, in the same order in which they were pushed. You can merge or process them as you please. Be aware that `Promise.all()` only resolves once all of the given promises have resolved, and it rejects as soon as any one of them rejects.
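
Applied to the question's loop, that might look like the following sketch (again assuming the question's placeholder `client_id`, `client_secret`, and `auth_code`):

const fetch = require("node-fetch");
const fs = require("fs");

const promises = [];
for (let i = 1; i <= 10; i++) {
    promises.push(
        fetch(`https://www.udemy.com/api-2.0/courses/?page=${i}&page_size=10&client_id=${client_id}&client_secret=${client_secret}`, {
            headers: { authorization: auth_code }
        }).then(res => res.json())
    );
}

// data[0] is page 1, data[1] is page 2, and so on
Promise.all(promises).then(data => {
    fs.writeFile("UDEMY.json", JSON.stringify({ table: data }), function (err) {
        if (err) throw err;
        console.log("Complete");
    });
});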

Kilian