Please consider the following code:
const race = () => Promise.any([
  // first promise, rejects in 100 ms
  new Promise((resolve, reject) => {
    setTimeout(reject, 100, 1);
  })
    .then(r => (console.log('one resolved with', r), r))
    .catch(err => (console.warn('one rejected with', err), err)),
  // second promise, resolves in 200 ms
  new Promise((resolve, reject) => {
    setTimeout(resolve, 200, 2);
  })
    .then(r => (console.log('two resolved with', r), r))
    .catch(err => (console.warn('two rejected with', err), err))
])
  .then(r => console.log('race resolved with', r))
  .catch(err => console.warn('race rejected with', err));

race();
Note: for brevity, I used comma expressions. If you prefer a more traditional syntax, here's the long form:
const race = () => Promise.any([
  new Promise((resolve, reject) => {
    setTimeout(reject, 100, 1);
  })
    .then(r => {
      console.log('one resolved with', r);
      return r;
    })
    .catch(err => {
      console.warn('one rejected with', err);
      return err;
    }),
  new Promise((resolve, reject) => {
    setTimeout(resolve, 200, 2);
  })
    .then(r => {
      console.log('two resolved with', r);
      return r;
    })
    .catch(err => {
      console.warn('two rejected with', err);
      return err;
    })
])
  .then(r => console.log('race resolved with', r))
  .catch(err => console.warn('race rejected with', err));

race();
I'm expecting race() to resolve with 2, in 200 ms. Instead, it resolves with the return value of the first promise's catch handler (1), in 100 ms. The return doesn't even have to be explicit: if my catch handler didn't return the error at all, Promise.any() would still resolve with undefined instead of waiting for the first fulfilled promise. Resolve, not even reject!

What's even funnier is that the expected behavior does happen if I do not specify a catch handler on the failing promise. Have a look:
const race = () => Promise.any([
  new Promise((resolve, reject) => {
    setTimeout(reject, 100, 1);
  })
    .then(r => (console.log('one resolved with', r), r)),
  new Promise((resolve, reject) => {
    setTimeout(resolve, 200, 2);
  })
    .then(r => (console.log('two resolved with', r), r))
])
  .then(r => console.log('race resolved with', r))
  .catch(err => console.warn('race rejected with', err));

race();
Maybe I want to log failures! Why does having some code run on rejection make the parent Promise.any() resolve with the return value of the first failed promise that happens to have a .catch handler, instead of waiting for the first fulfilled promise, as advertised?

Shouldn't the parent Promise.any() behave the same regardless of whether a catch callback is defined?

Is this a bug? I find it mind-boggling.
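To isolate the behavior, here's the smallest repro I could come up with (no Promise.any() involved at all):

```javascript
// A rejected promise with a .catch whose handler returns normally
// ends up as a *fulfilled* promise -- the rejection is "handled".
const p = Promise.reject(1).catch(err => err);

p.then(
  v => console.log('fulfilled with', v),
  e => console.warn('rejected with', e)
);
// logs: fulfilled with 1
```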
Update: after receiving the explanation of why this happens, I managed to get what I was initially looking for (logging errors in catch blocks while still waiting for the first of the other promises to fulfill) by returning Promise.reject(err) in the catch handler:
const race = () => Promise.any([
  // first promise, rejects in 100 ms
  new Promise((resolve, reject) => {
    setTimeout(reject, 100, 1);
  })
    .then(r => (console.log('one resolved with', r), r))
    .catch(err => (console.warn('one rejected with', err), Promise.reject(err))),
  // second promise, resolves in 200 ms
  new Promise((resolve, reject) => {
    setTimeout(resolve, 200, 2);
  })
    .then(r => (console.log('two resolved with', r), r))
    .catch(err => (console.warn('two rejected with', err), Promise.reject(err)))
])
  .then(r => console.log('race resolved with', r))
  .catch(err => console.warn('race rejected with', err));

race();
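An equivalent way to keep the rejection propagating (if you prefer a statement body over returning a rejected promise) is to re-throw inside the catch handler; a minimal sketch:

```javascript
const race = () => Promise.any([
  // first promise, rejects in 100 ms; the catch logs, then re-throws
  new Promise((resolve, reject) => {
    setTimeout(reject, 100, 1);
  })
    .catch(err => {
      console.warn('one rejected with', err);
      throw err; // keeps this promise rejected
    }),
  // second promise, resolves in 200 ms
  new Promise(resolve => {
    setTimeout(resolve, 200, 2);
  })
])
  .then(r => (console.log('race resolved with', r), r));

race(); // logs "one rejected with 1", then "race resolved with 2"
```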