I'm writing a then() callback to extract the JSON data from an array of responses to fetch() calls. In the code below, queries is an array of promises returned by a series of calls to fetch(). I'm using async/await for the responses because otherwise the promises would be returned without resolving (I found a solution in this question).

My first attempt worked properly: when I push into jsonified I obtain an array with the resolved JSON data as elements:

return Promise.all(queries)
.then(async (responses) => {
    let jsonified = [];
    for (let res of responses) {
        jsonified.push(await res.json());
    }
    return jsonified;
})
.then(data => ...

But when I went to refactor it with Array.reduce(), I realised that when I push into the accumulator, instead of obtaining an array with a promise as an element, acc is assigned a promise instead.

.then(responses => {
    return responses.reduce(async (acc, next) => {
        acc.push(await next.json());
        return acc;
    }, [])
})

I can use the first version without any issue and the program works properly, but what's happening inside Array.reduce()? Why does pushing into the accumulator return a promise instead of an array? How could I refactor the code with Array.reduce()?
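
Here's a minimal reproduction of the same behaviour, with Promise.resolve standing in for the real responses so it runs without fetch (a sketch, not my actual code):

// Stand-in for res.json()
const fakeJson = n => Promise.resolve(n * 2);

const result = [1, 2, 3].reduce(async (acc, next) => {
    acc.push(await fakeJson(next));
    return acc;
}, []);

// Logs "Promise { <pending> }" rather than an array; the promise later
// rejects with "TypeError: acc.push is not a function"
console.log(result);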

maja

2 Answers

Although it's not what you asked, you could avoid the pain of having to use reduce and just utilise the Promise.all() that you're already using:

return Promise.all(queries.map(q => q.then(res => res.json())))
  .then(data => {...})

It's much shorter and less of a headache to read when you come back to it.
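
For instance, as a self-contained sketch (the URLs are placeholders; queries is assumed to hold the pending fetch calls):

// Placeholder endpoints; substitute your own
const queries = [
  fetch('https://example.com/a.json'),
  fetch('https://example.com/b.json'),
];

Promise.all(queries.map(q => q.then(res => res.json())))
  .then(data => {
    // data is an array of parsed JSON bodies, in the same order as queries
    console.log(data);
  });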

Kobe

Have the accumulator's initial value be a Promise that resolves to an empty array, then await the accumulator on each iteration (so that all prior iterations resolve before the current iteration runs):

.then(responses => {
    return responses.reduce(async (accPromiseFromLastIter, next) => {
        const arr = await accPromiseFromLastIter;
        arr.push(await next.json());
        return arr;
    }, Promise.resolve([]))
})

(That said, your original code is a lot clearer; I'd prefer it over the .reduce version.)

Live demo:

const makeProm = num => Promise.resolve(num * 2);

const result = [1, 2, 3].reduce(async (accPromiseFromLastIter, next) => {
  const arr = await accPromiseFromLastIter;
  arr.push(await makeProm(next));
  return arr;
}, Promise.resolve([]));

result.then(console.log);

Unless you have to retrieve the data serially, consider using Promise.all to call the .json() of each response in parallel instead, so that the result is produced more quickly:

return Promise.all(queries)
.then(responses => Promise.all(responses.map(response => response.json())));

If queries is an array of promises that were just created by calls to fetch, it would be even better to chain the .json() call onto the original fetch call instead, e.g.:

const urls = [ ... ];
const results = await Promise.all(
  urls.map(url => fetch(url).then(res => res.json()))
);

This way, you can consume the responses immediately when they come back, rather than having to wait for all responses to come back before starting to process the first one.
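
For instance, if each result can be handled on its own, something like this sketch (handleData and the URLs are hypothetical placeholders):

const urls = ['https://example.com/a.json', 'https://example.com/b.json'];
const handleData = data => console.log(data); // hypothetical per-result handler

Promise.all(
  urls.map(url => fetch(url)
    .then(res => res.json())
    .then(handleData) // runs as soon as this particular response is parsed
  )
).then(() => console.log('all responses handled'));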

CertainPerformance
  • I went with the last suggestion, passing the fetch promises as soon as they're individually ready. – maja Dec 24 '19 at 02:35