I've tried to measure request-response time programmatically and ended up with this code:
// Simulates a request that resolves after `wait` milliseconds
function fakeRequest(wait) {
  return new Promise(resolve => {
    setTimeout(() => resolve(wait), wait);
  });
}
// Measures how long fn(...params) takes to resolve
function calculateTime(fn, params) {
  const startTime = new Date().getTime();
  fn(...params)
    .then(response => {
      const endTime = new Date().getTime();
      const requestTime = endTime - startTime;
      console.log(`
        Request should take ${response} ms
        Request took ${requestTime} ms
      `);
    });
}
calculateTime(fakeRequest, [2000]);
In this example the resolve time is hardcoded (2000 milliseconds), so in my understanding the measured result should always be the same: 2 seconds. But when I run this code on my machine, I get results that vary between 2000 ms and 2003 ms.
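For comparison, here is a stripped-down sketch that times a bare setTimeout without the Promise wrapper; as far as I can tell it should behave the same way (the helper name is just for illustration):

// Times a bare setTimeout call, without wrapping it in a Promise
function timeSetTimeout(wait) {
  const startTime = new Date().getTime();
  setTimeout(() => {
    const elapsed = new Date().getTime() - startTime;
    console.log(`setTimeout(${wait}) fired after ${elapsed} ms`);
  }, wait);
}

timeSetTimeout(2000);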
I'm trying to figure out where these extra milliseconds come from:

1. The reason is the execution time of new Date().getTime() itself (but if so, why do I get different results between 2000 and 2003 instead of the same value on every run? See the sketch after this list).
2. The reason is the asynchronous nature of the request, even though the resolve time is hardcoded.
3. Something else.
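To check the first possibility, something like this rough sketch could estimate the cost of the Date call itself (the iteration count is arbitrary):

// Rough estimate of how long repeated new Date().getTime() calls take
const iterations = 100000;
const start = new Date().getTime();
for (let i = 0; i < iterations; i++) {
  new Date().getTime();
}
console.log(`${iterations} calls took ${new Date().getTime() - start} ms in total`);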
I'd like to hear your thoughts and to find a way to get the real response time (2 seconds in this case).
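In case it's relevant, here is a variant of calculateTime that uses performance.now() for sub-millisecond timestamps, assuming an environment where performance.now() is available globally (modern browsers and recent Node versions); I'm not sure it changes the underlying behaviour:

// Same idea as calculateTime, but with performance.now() for finer resolution
function calculateTimeHighRes(fn, params) {
  const startTime = performance.now();
  return fn(...params).then(response => {
    const requestTime = performance.now() - startTime;
    console.log(`
      Request should take ${response} ms
      Request took ${requestTime.toFixed(3)} ms
    `);
  });
}

calculateTimeHighRes(fakeRequest, [2000]);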