I'm finally trying to get to grips with Promises, one of the newer and more mysterious corners of JavaScript. The following is a problem I came across in a technical test, and which seems to get to the heart of what I don't understand about Promises.
The problem is as follows:
- You have a predefined class called `Service`, with the following methods:
  - `generate()` - generates a random number between 0 and 100.
  - `guess(yourGuess)` - takes a given guess for the random number, and returns a Promise. This Promise, in turn, will do one of two things within 100 ms:
    - If the guess is correct, the Promise will be resolved, with the guess corresponding to that Promise as its first argument.
    - If the guess is incorrect, the Promise will be rejected.
  - `submit(yourGuess)` - submits a guess you believe is correct.
- You have to write an asynchronous function, `main()`, which will:
  - Generate a random number using a `Service` object.
  - Submit a correct guess to the same object within 400 ms.
  - Any rejected Promises have to be caught.
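For concreteness, here is a minimal stand-in for `Service` that I wrote purely from the description above so I could run my code locally; the `secret` field and the exact delay are my own guesses, not part of the actual test:

```javascript
// Minimal mock of the predefined Service class, based only on the
// behaviour described in the problem statement.
class Service {
  generate() {
    // Pick and remember a random integer between 0 and 100.
    this.secret = Math.floor(Math.random() * 101);
  }

  guess(yourGuess) {
    // Resolve with the guess if it matches the secret, reject otherwise.
    // The 10 ms delay is arbitrary, chosen to stay within the 100 ms bound.
    return new Promise((resolve, reject) => {
      setTimeout(() => {
        if (yourGuess === this.secret) {
          resolve(yourGuess);
        } else {
          reject(new Error(`wrong guess: ${yourGuess}`));
        }
      }, 10);
    });
  }

  submit(yourGuess) {
    // Record the guess the caller believes is correct.
    this.submitted = yourGuess;
  }
}
```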
This is my code:
const MAX_NUMBER = 100;
async function main()
{
const service = new Service();
let guessInPromise;
service.generate();
for(let i = 0; i <= MAX_NUMBER; i++)
{
service.guess(i)
.then(guessInPromise => service.submit(guessInPromise))
.catch(err => console.log(err));
}
return service;
}
Would my code get the job done? Is there anything I've obviously misunderstood about Promises and asynchronous functions?