I've tried to measure the request-response time programmatically, and ended up with this code:

function fakeRequest(wait) {
  return new Promise(resolve => {
    setTimeout(() => resolve(wait), wait);
  });
}

function calculateTime(fn, params) {
  const startTime = new Date().getTime();
  fn(...params)
    .then(response => {
      const endTime = new Date().getTime();
      const requestTime = endTime - startTime;
      console.log(`
        Request should take ${response} ms
        Request took ${requestTime} ms
      `);
    });
}

calculateTime(fakeRequest, [2000]);

In this example, the resolve time is hardcoded (2000 milliseconds), so in my understanding the final result should always be the same: 2 seconds. But when I run this code on my machine, it gives me varying results between 2000 ms and 2003 ms.

I'm trying to figure out where these extra 3 milliseconds come from:

  1. The reason is the execution time of `new Date().getTime()` itself (but if so, why do we get varying results between 2000 and 2003, rather than the same value on every execution?).

  2. The reason is the asynchronous nature of the request, even though it has a hardcoded resolve time.

  3. Something else.

I'd like to hear your thoughts and find a way to get the real response time (2 seconds in this case).
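For reference, a minimal sketch of the same measurement run in a loop (wrapping the original `fakeRequest` with `async`/`await`) shows that the overhead varies from run to run, so it can't be a fixed, deterministic cost:

```javascript
// Repeat the same measurement several times: the overhead above the
// requested delay differs between runs, so it isn't a constant cost.
function fakeRequest(wait) {
  return new Promise(resolve => setTimeout(() => resolve(wait), wait));
}

async function measure(wait) {
  const start = Date.now();
  await fakeRequest(wait);
  return Date.now() - start; // elapsed milliseconds
}

(async () => {
  for (let i = 1; i <= 3; i++) {
    console.log(`run ${i}: ${await measure(50)} ms`); // usually a few ms above 50
  }
})();
```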

P.S.
  • `setTimeout` doesn't (and cannot) provide a guarantee that the callback is executed exactly after the provided delay. https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/setTimeout#Reasons_for_delays_longer_than_specified – Felix Kling Dec 17 '18 at 20:46
  • @FelixKling, hm, that's a little unexpected for me. So you mean that this method measures the request-response time correctly, and the issue is just in the `setTimeout` usage? – P.S. Dec 17 '18 at 20:49
  • 1
    Most likely. If you run this a couple of times you get different values. – Felix Kling Dec 17 '18 at 20:50
  • 1
    This might help: https://medium.com/front-end-weekly/javascript-event-loop-explained-4cd26af121d4 – ic3b3rg Dec 17 '18 at 20:50
  • Got it, thank you, guys. Digging into the article... – P.S. Dec 17 '18 at 20:52
  • 1
    Related: check the first three links in [this answer](https://stackoverflow.com/a/29972322/1048572) – Bergi Dec 17 '18 at 22:03
  • @Bergi, thank you, we can never have too many links – P.S. Dec 17 '18 at 23:20
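The point from the comments can be seen in isolation: `setTimeout` guarantees only a minimum delay, because the callback is queued after the delay and runs once the event loop gets to it. A minimal sketch:

```javascript
// setTimeout guarantees a minimum delay, not an exact one: the callback
// runs when the event loop reaches it, which can add a few milliseconds.
const requested = 100;
const start = Date.now();

setTimeout(() => {
  const elapsed = Date.now() - start;
  console.log(`requested ${requested} ms, observed ${elapsed} ms`);
  console.log(`overhead: ${elapsed - requested} ms`); // small, and varies between runs
}, requested);
```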

2 Answers


While Felix is right that setTimeout can't guarantee the exact callback time, there are some things to note about your code. You aren't capturing the end time as early as possible (right when the timer fires, before the promise chain runs). My test below gets slightly closer to the desired time. My point is that even if your setTimeout were exact, I don't think your log would be correct anyway.

var closeEnd;

function fakeRequest(wait) {
  return new Promise(resolve => {
    setTimeout(() => {
      closeEnd = performance.now(); // capture the end time as soon as the timer fires
      resolve(wait);
    }, wait);
  });
}

function calculateTime(fn, params) {
  const startTime = performance.now();
  fn(...params)
    .then(response => {
      const requestTime = closeEnd - startTime;
      console.log(`
        Request should take ${response} ms
        Request took ${requestTime} ms
      `);
    });
}

calculateTime(fakeRequest, [2000]);
Matt Way

If you replace the `setTimeout` call with a direct `resolve(wait)`, you will still get ~5 ms. That is probably due to two things:

1) `Date.now()` (which is what `new Date().getTime()` boils down to) does not return a perfectly accurate time.

2) Promises are always resolved asynchronously, so there is a small delay until the next engine tick.

So even if setTimeout were accurate (or if you weren't mocking the request), you still wouldn't get a perfectly accurate result, and there is no way to. In practice, I see no reason why those few milliseconds would matter.
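The second point can be demonstrated on its own: even a promise that resolves with no timer at all is observed asynchronously in a `.then` callback, so the measured duration is tiny but not exactly zero. A small sketch:

```javascript
// A .then callback runs on a later microtask even when the promise is
// already resolved, so the measured duration is nonzero (though tiny).
function instantRequest(value) {
  return Promise.resolve(value); // resolves immediately, no setTimeout involved
}

const start = performance.now();
instantRequest(42).then(() => {
  const elapsed = performance.now() - start;
  console.log(`instant "request" still took ${elapsed} ms`);
});
```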

Jonas Wilms
  • Actually, it doesn't really matter, I just wanted to understand why this happens, thanks a lot for the clarification – P.S. Dec 17 '18 at 21:27