While playing around with Promises to understand how they work, I noticed something I can't explain.
Given this example:
var A = function () {
    return Promise.resolve();
};

var B = function () {
    return Promise.reject();
};

var c = A();
var d = B();

c.then(
    function () { console.log('A success'); },
    function () { console.log('A fail'); }
);

d.then(
    function () { console.log('B success'); },
    function () { console.log('B fail'); }
);

Promise.all([c, d]).then(
    function () { console.log('all success'); },
    function () { console.log('all fail'); }
);
First the individual resolve/reject callbacks fire, followed by the reject callback of Promise.all. This is expected, because B rejects its Promise.
But when written like this, the resolve callback of Promise.all fires instead:
var A = function () {
    return Promise.resolve();
};

var B = function () {
    return Promise.reject();
};

var c = A().then(
    function () { console.log('A success'); },
    function () { console.log('A fail'); }
);

var d = B().then(
    function () { console.log('B success'); },
    function () { console.log('B fail'); }
);

Promise.all([c, d]).then(
    function () { console.log('all success'); },
    function () { console.log('all fail'); }
);
This is unexpected, since one of the two Promises is rejected, so the Promise returned by Promise.all should be rejected too. What is happening here – does it have something to do with return values? Do I need to return a new Promise somewhere?
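For comparison, I experimented with re-throwing inside the rejection handler, which does make Promise.all reject again. This seems to suggest that .then returns a new Promise whose fate depends on what the handler does (the variable names here mirror my example above):

```javascript
var d = Promise.reject(new Error('boom')).then(
    function () { console.log('B success'); },
    function (err) {
        console.log('B fail');
        // Re-throwing propagates the rejection to the Promise
        // that this .then() call returns; returning normally
        // (or returning nothing) would fulfill it instead.
        throw err;
    }
);

Promise.all([Promise.resolve(), d]).then(
    function () { console.log('all success'); },
    function () { console.log('all fail'); }
);
// logs "B fail", then "all fail"
```

So with the re-throw in place, d stays rejected and the "all fail" branch runs again.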