I need to run an operation relatively often (let's say 10 times per second; that should be fast enough, but I can sacrifice speed if there are issues). The operation is an ajax request, so I potentially don't know how long it takes - it could even take seconds if the network is bad.
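For concreteness, operation() could be something like the following sketch, assuming the ajax request is made with fetch against a hypothetical /api/poll endpoint:

function operation() {
  // Fire the ajax request; how long this takes depends entirely on the network.
  return fetch('/api/poll')
    .then(response => response.json())
    .then(data => {
      // do something with the response
      console.log(data);
    });
}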
The usual:
setInterval(() => operation(), 100);
will not work here, because if the network is bad and my operation takes more than 100 ms, the calls might get scheduled one after another, occupying JS engine time (please correct me if I'm wrong).
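To make the concern visible, here is a small sketch (assuming operation() returns a promise, as in the fetch example above) that counts how many requests are in flight at once; if responses regularly take longer than 100 ms, the counter keeps climbing:

let inFlight = 0;

setInterval(() => {
  inFlight++;
  console.log('requests in flight:', inFlight);
  // Decrement only when the request actually settles (success or failure).
  operation().finally(() => inFlight--);
}, 100);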
The other possible solution is to run it recursively:
function execute() {
  operation();
  setTimeout(execute, 100);
}
This means that there will be 100 ms between the calls to operation(), which is OK for me. The problem with this is that I'm afraid it will fail at some point because of a stack overflow. Consider this code:
i = 0;
function test() { if (i % 1000 == 0) console.log(i); i++; test(); }
If I run it in my console, it fails after around 12000 calls. If I add a setTimeout at the end, this would mean 12000 / 10 / 60 = 20 minutes before hitting that limit, potentially ruining the user experience.
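For reference, the variant with setTimeout at the end that I have in mind looks like this:

let i = 0;
function test() {
  if (i % 1000 === 0) console.log(i);
  i++;
  setTimeout(test, 100); // re-schedule instead of calling test() directly
}
test();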
Are there any simple ways to do this and be sure it can run for days?