Objective
Find out the true behavior of setTimeout(fn, X) when X -> 0.
Background
I am developing a tool that runs QPS (Queries Per Second) tests using setTimeout(). I was recently surprised when a test at 1000 QPS took roughly 5 seconds to execute when it should have taken 1 (not counting any other external factors). The tests work fine for lower QPS values, below 100 for example.
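To make the problem concrete, here is a rough sketch of what my test loop does (runQpsTest and sendQuery are placeholder names; my actual tool is more involved):

```js
// Rough sketch of the QPS test loop (placeholder names, simplified).
function runQpsTest(qps, totalQueries, sendQuery) {
  const delayMs = 1000 / qps; // 1 ms between queries at 1000 QPS
  const start = Date.now();
  let sent = 0;

  function tick() {
    sendQuery();
    sent += 1;
    if (sent < totalQueries) {
      setTimeout(tick, delayMs); // schedule the next query
    } else {
      console.log(`sent ${sent} queries in ${Date.now() - start} ms`);
    }
  }

  setTimeout(tick, delayMs);
}

// Expected to finish in about 1 second, but takes roughly 5:
runQpsTest(1000, 1000, () => { /* issue one query */ });
```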
While investigating, I found some articles explaining that in setTimeout(fn, X) calls, as X tends to 0, values that are too small get clamped to a higher minimum. An example of this can be seen in this question from 2011, where the old spec specified that in setTimeout(fn, X) with X <= 3, X would automatically be set to 4. This means that the minimum amount of time I can ever hope to wait between two executions of setTimeout() is 4 milliseconds: setTimeout(fn, 1) is effectively converted to setTimeout(fn, 4).
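Here is a small loop that can measure the actual gap between setTimeout(fn, 1) callbacks in node.js (a quick sketch; the exact numbers will vary by machine and Node version):

```js
// Measure the real delay between consecutive setTimeout(fn, 1) callbacks.
const samples = [];
let remaining = 100;
let last = process.hrtime.bigint();

function step() {
  const now = process.hrtime.bigint();
  samples.push(Number(now - last) / 1e6); // elapsed ms since last callback
  last = now;
  if (--remaining > 0) {
    setTimeout(step, 1);
  } else {
    const avg = samples.reduce((a, b) => a + b, 0) / samples.length;
    console.log(`average gap: ${avg.toFixed(3)} ms over ${samples.length} callbacks`);
  }
}

setTimeout(step, 1);
```

If the average comes out well above 1 ms, that would confirm the clamping (or event-loop overhead) I am describing.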
Questions
Every article I see says something different, and in the question I mentioned previously, different answers say different things as well. The overall conclusion seems to be "there is no conclusion, as the behavior is highly inconsistent".
Coming back to node.js, and since the question I pointed to is quite old, I would like an update on the following:
- One of the answers says that the minimum value of X is 1. Is this still accurate?
- How does setTimeout(fn, X) behave when X -> 0?
Conclusion
I would like more information regarding setTimeout() so I can build my code around it. Links to documentation, and the dates of the articles or answers you reference, would be highly appreciated.
Thanks for the help!