Here's what's going on:
1. `Promise.resolve(1)` creates a promise fulfilled with the value `1`.
2. `.then(x => x + 1)` adds a fulfillment handler to that promise, creating a new promise (`.then`, `.catch`, and `.finally` always create promises). When the fulfillment handler is called, it takes the fulfillment value of the promise it was called on as `x`, adds `1` to it, and uses that to fulfill its promise.
3. `.then(x => { throw x })` adds a fulfillment handler to the promise from Step 2, creating a new promise. When called, the fulfillment handler takes the fulfillment value of the promise it was attached to (`1 + 1` is `2`) and uses it as the value to `throw`. That rejects the promise this second `.then` created.
4. `.catch(err => console.log(err))` adds a rejection handler to that promise. When the rejection handler is called, it takes the rejection reason (which we know will be `2`) as `err` and logs it. Since the rejection handler doesn't throw an error or return a promise that is rejected or will reject, it converts rejection to fulfillment. In this case, it uses the return value of `console.log` to fulfill its promise. That value is `undefined`.
...and then the same sort of pattern is repeated again.
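Putting the steps above together as a runnable sketch (the repeated part of the chain is elided in the question, so this stops at the `.catch`; the final `.then` here is added purely to observe the result):

```javascript
Promise.resolve(1)                        // fulfilled with 1
    .then(x => x + 1)                     // fulfilled with 2
    .then(x => { throw x; })              // rejected with 2
    .catch(err => console.log(err))       // logs 2; fulfills with undefined
    .then(result => console.log(result)); // logs undefined
```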
Key bits there are:

- `.then`, `.catch`, and `.finally` create promises.
- If you return a value (explicitly or implicitly) from a fulfillment or rejection handler, and that value isn't a promise that is rejected or will reject, it fulfills the promise `.then` or `.catch` created with that value.
- If you throw (or return a promise that is rejected or will reject) in a `.then` or `.catch` handler, you reject the promise `.then` or `.catch` created.
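Each of those rules can be seen in isolation (illustrative snippets, not from the question's code):

```javascript
// Returning a value from a handler fulfills the promise .then created:
Promise.resolve("ignored")
    .then(() => 42)
    .then(v => console.log(v)); // 42

// Throwing in a handler rejects the promise .then created:
Promise.resolve("ignored")
    .then(() => { throw new Error("boom"); })
    .catch(err => console.log(err.message)); // "boom"

// A .catch handler that returns normally converts rejection to fulfillment:
Promise.reject(new Error("boom"))
    .catch(() => "recovered")
    .then(v => console.log(v)); // "recovered"
```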
If you're not entirely clear on the terminology around promises ("fulfill," "reject," "resolve" [which I didn't use above, but the code did]), I have a blog post for that. ;-)