
I've got a script that makes thousands of requests to a specific API. This API only allows 5 requests per second (and it probably measures this differently than I do). To make the requests I'm using the request-promise library, and I've replaced the normal request-promise function with this:

const request_promise  = require('request-promise')

function waitRetryPromise() {
  var count = 0 // keeps count of requests
  function rp(options) {
    const timedCall = (resolve) => setTimeout( ()=>resolve(rp(options)),1000) // recursive call
    count += 1
    if (count % 3 == 0) { // recalls after a second on every third request
      return new Promise(timedCall)
    } else {
      return request_promise(options)
    }
  }
  return rp
}

const rp = waitRetryPromise()

Once around 300 requests (give or take) are fired off in short succession, these requests start to interfere with each other. Does anyone have a better solution? I thought the recursive call to this same function would help, and it did, but it didn't solve the problem. Maybe there is a pattern for queueing requests and doing them a few at a time? A library, perhaps?

Thanks!

Jono
  • `these requests start to interfere with each other` in what way? – Jaromanda X Oct 12 '17 at 04:22
  • still like to know what you mean by `these requests start to interfere with each other` – Jaromanda X Oct 12 '17 at 05:16
  • so,, no response to a valid question? – Jaromanda X Oct 12 '17 at 11:32
  • Sorry, I asked this right before bed. What I meant by the above quote is that even when I delay some requests, they inevitably still start at the same time as later, non-delayed ones, causing the same problem, as long as there are enough requests. :-) – Jono Oct 12 '17 at 13:30
  • All the answers seem to apply an approximate delay to achieve the desired outcome. I like the idea of restricting the number of calls per unit of time to the API limit and will pursue that. – HankCa May 11 '19 at 23:17
  • https://stackoverflow.com/a/47627820/1019307 seems to be what I'm looking for. – HankCa May 12 '19 at 00:41

4 Answers


OK, rather than recursing the call to `rp`, just make sure you delay between requests by an appropriate amount ... for 5 per second, that's 200 ms:

function waitRetryPromise() {
    let promise = Promise.resolve();
    return function rp(options) {
        return promise = promise
        .then(() => new Promise(resolve => setTimeout(resolve, 200)))
        .then(() => request_promise(options));
    }
}
const rp = waitRetryPromise();
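A quick way to convince yourself the chaining serializes the calls, using a stubbed request function (the names `fakeRequest` and `makeThrottled` are illustrative for this demo only, not part of request-promise):

```javascript
// Stand-in for request_promise: records when each "request" actually starts.
const fakeRequest = async (options) => ({ uri: options.uri, at: Date.now() });

// Same pattern as above, parameterized so it can be tested with any delay.
function makeThrottled(requestFn, delayMs) {
    let promise = Promise.resolve();
    return function rp(options) {
        return promise = promise
            .then(() => new Promise(resolve => setTimeout(resolve, delayMs)))
            .then(() => requestFn(options));
    };
}

const throttled = makeThrottled(fakeRequest, 200); // 5 per second

// All three calls are issued at once, but their start times end up
// spaced roughly 200 ms apart.
Promise.all([1, 2, 3].map(i => throttled({ uri: `/item/${i}` })))
    .then(results => console.log(results.map(r => r.uri)));
```

The key detail is `promise = promise.then(...)`: each call extends a single shared chain, so no request can begin until the previous one's delay has elapsed.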
Jaromanda X
  • I don't think this will have the desired effect though I'll try. Since many are queued at the same time won't they just all have a delay and then start all at the same time anyways? – Jono Oct 12 '17 at 13:33
  • Yes. I keep messing up the code for some reason. Should've seen my first attempt!!! So derp – Jaromanda X Oct 12 '17 at 13:48
  • changed `return promise.then....` to `return promise = promise.then....` to make it work – Jaromanda X Oct 12 '17 at 13:50

My code will run the `TimedQueue` for as long as there is work to be done. The `process()` method resolves when all the work is finished:

class Queue {
    constructor() {
        this.queue = [];
    }

    enqueue(obj) {
        return this.queue.push(obj);
    }

    dequeue() {
        return this.queue.shift();
    }

    hasWork() {
        return (this.queue.length > 0);
    }
}

function t_o(delay) {
    return new Promise(function (resolve, reject) {
        setTimeout(function () {
            resolve();
        }, delay);
    });
}

class TimedQueue extends Queue {
    constructor(delay) {
        super();
        this.delay = delay;
    }

    dequeue() {
        return t_o(this.delay).then(() => {
            return super.dequeue();
        });
    }
    
    process(cb) {
        return this.dequeue().then(data => {
            cb(data);
            if (this.hasWork())
                return this.process(cb);
        });
    }
}


var q = new TimedQueue(500);

for (var request = 0; request < 10; ++request)
    q.enqueue(request);

q.process(console.log).then(function () {
    console.log('done');
});
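Applied to the original question, one option is to enqueue the request option objects and dispatch from the callback. This is a sketch: the classes are re-stated compactly so the snippet runs on its own, and the wiring to `request_promise` is hypothetical (here it just records the order):

```javascript
// Compact restatement of the Queue / TimedQueue classes above.
class Queue {
    constructor() { this.queue = []; }
    enqueue(obj) { return this.queue.push(obj); }
    dequeue() { return this.queue.shift(); }
    hasWork() { return this.queue.length > 0; }
}

class TimedQueue extends Queue {
    constructor(delay) { super(); this.delay = delay; }
    dequeue() {
        // Wait `delay` ms before handing out the next item.
        return new Promise(r => setTimeout(r, this.delay))
            .then(() => super.dequeue());
    }
    process(cb) {
        return this.dequeue().then(data => {
            cb(data);
            if (this.hasWork()) return this.process(cb);
        });
    }
}

// 200 ms between dequeues => at most 5 requests per second.
const apiQueue = new TimedQueue(200);
const sent = [];
for (const uri of ['/a', '/b', '/c'])
    apiQueue.enqueue({ uri, json: true });

apiQueue.process(options => {
    // request_promise(options) would go here; collect the returned
    // promises if you need the responses later.
    sent.push(options.uri);
}).then(() => console.log('dispatched:', sent.join(', ')));
```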
Rafael
  • In your case, change the `TimedQueue`'s delay to `200` (1/5) of a second. – Rafael Oct 12 '17 at 04:41
  • haven't read the code in detail (I like this better than what I was going to post in some ways), but if `q.enqueue` is called after `q.process` will it still work? if `q.enqueue` is called after `console.log('done');`, do you have to `q.process` again? – Jaromanda X Oct 12 '17 at 04:55
  • Sorry, yes, you can add work while the queue is processing and it will continue to run, eventually processing the newly enqueued item, without a second call to process. – Rafael Oct 12 '17 at 04:57
  • no, you made a valid point, I should've done my own test - once the queue is empty, adding to it means you need to process again (see, I tested it :p ) – Jaromanda X Oct 12 '17 at 04:59
  • Process is unique, if the TimedQueue is *processing*, enqueing an item between delays *will not* need a second call to process. If the queue finished beforehand, then yes, you will have to requeue and reprocess. The beauty is that you don't have to write any fancy code to determine that. If the promise was resolved, then you know, you have to reprocess, if it hasn't, then freely enqueue. – Rafael Oct 12 '17 at 05:01
  • @Raf I like this solution but I like the other one more; it requires less understanding/complexity. However, please leave it here - it's great and may be exactly what some are looking for. – Jono Oct 12 '17 at 17:54

You can use a mutex to synchronize concurrent calls and limit them with a timer:

import { Mutex } from 'async-mutex';

const wait = (ms: number): Promise<void> => new Promise((res) => setTimeout(res, ms));

export class RateLimiter {
    private readonly mutex = new Mutex();

    private delay = 1000 / 2;

    public async run<T>(cb: () => Promise<T>): Promise<T> {
        const release = await this.mutex.acquire();

        try {
            return cb();
        } finally {
            // you can add `await` before to delay the result
            wait(this.delay).then(release);
        }
    }
}

An example usage:

// ...NestJS Controller
private readonly rateLimiter = new RateLimiter();

@Get('random')
public async random() {
    const result = await this.rateLimiter.run(async () => Math.random());

    return result;
}

The mutex lib documentation
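The same serialization can be sketched in plain JavaScript without the async-mutex dependency, using a promise chain as the lock (the `RateLimiter` name and shape here mirror the answer above but are otherwise illustrative):

```javascript
// A promise chain stands in for the mutex: each call waits for the
// previous call's completion plus the configured delay.
class RateLimiter {
    constructor(delayMs) {
        this.delayMs = delayMs;
        this.lock = Promise.resolve();
    }
    run(cb) {
        const result = this.lock.then(() => cb());
        // Hold the "lock" until the delay elapses, regardless of
        // whether cb resolved or rejected.
        this.lock = result
            .catch(() => {})
            .then(() => new Promise(r => setTimeout(r, this.delayMs)));
        return result;
    }
}

const limiter = new RateLimiter(200); // at most 5 calls per second
limiter.run(async () => 'first').then(console.log);
limiter.run(async () => 'second').then(console.log);
```

Unlike the delay-only approaches, this spaces out the *starts* of the callbacks even when hundreds of them are queued at once, which addresses the "they all start at the same time anyway" problem from the question's comments.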

zemil

I am not sure, but maybe you can get an idea from the example below:

function placeAnOrder(orderNumber) {
    console.log("customer order:", orderNumber)
    cookAndDeliverFood(function () {
        console.log("food delivered order:", orderNumber);
    });
}

//  1 sec need to cook

function cookAndDeliverFood(callback){
    setTimeout(callback, 1000);
}

//users web request
placeAnOrder(1);
placeAnOrder(2);
placeAnOrder(3);
R.K.Bhardwaj
  • Hey Raj, thanks for posting, but your code is in some ways the interior of the code I already have. It won't solve the problem, as all you're doing is making each request take 1 second; all requests (if spawned asynchronously) will still happen in the same second. – Jono Oct 12 '17 at 14:20