74

I'm using mikeal/request to make API calls. One of the APIs I use most frequently (the Shopify API) recently put out a new call limit, and I'm seeing errors like:

Exceeded 6.0 calls per second for api client. Slow your requests or contact support for higher limits.

I've already gotten an upgrade, but regardless of how much bandwidth I get, I have to account for this. A large majority of the requests to the Shopify API are inside async.map() functions, which fire off asynchronous requests and gather the response bodies.

I'm looking for any help, perhaps a library that already exists, that would wrap the request module and block, sleep, throttle, or otherwise manage the many simultaneous requests that fire off asynchronously, limiting them to, say, 6 requests at a time. I have no problem working on such a project if it doesn't exist; I just don't know how to handle this kind of situation, and I'm hoping there's some kind of standard.

I made a ticket with mikeal/request.

ThomasReggi
  • No kidding. I finally got fed up with the ElasticTranscoder UI and built up code to use the API through the JS SDK, and instantly hit these limits. – rainabba Feb 17 '16 at 11:59
  • 1
    In 2018 there is [rate-limiter-flexible](https://github.com/animir/node-rate-limiter-flexible) package which does the job – Animir Jul 02 '18 at 12:42
  • Can anyone give a Java solution also? – nil96 Jun 04 '20 at 07:49

10 Answers

38

For an alternative solution, I used the node-rate-limiter to wrap the request function like this:

var request = require('request');
var RateLimiter = require('limiter').RateLimiter;

var limiter = new RateLimiter(1, 100); // at most 1 request every 100 ms
var throttledRequest = function() {
    var requestArgs = arguments;
    limiter.removeTokens(1, function() {
        request.apply(null, requestArgs); // forward the original arguments to request()
    });
};
Dmitry Chornyi
  • I'm going to look into this! Thanks a bunch! – ThomasReggi Jul 30 '14 at 15:26
  • 22
    Author of node-rate-limiter here. This library is probably going to be a better fit for the stated problem, since async.queue() only places limits on concurrency and has no concept of time. API rate limits are generally time-based (ie max of 6 calls per second) which can be expressed as `var limiter = new RateLimiter(6, 'second');` It is complementary to a solution like oibackoff which will change behavior after a rate limit has been hit. – jhurliman Aug 28 '15 at 21:16
  • Can I do it for all requests as a whole, or do I need to do it individually? I mean, can I put it inside my middleware? If yes, how will it be applied: to all endpoints or to each endpoint? – Kamalakannan J Oct 05 '15 at 09:13
  • 2
    Does this only limit calls, or does it also work as a queueing mechanism? Meaning, if I exceed the limit, will it queue the request and start calling again once the limit is refreshed? – ChickenWing24 Nov 19 '15 at 02:55
  • Yes, it will queue. I think what happens is `node-rate-limiter` calls you back once the token becomes available – Dmitry Chornyi Jan 22 '16 at 00:16
24

The npm package simple-rate-limiter seems to be a very good solution to this problem.

Moreover, it is easier to use than node-rate-limiter and async.queue.

Here's a snippet that shows how to limit all requests to ten per second.

var limit = require("simple-rate-limiter");
var request = limit(require("request")).to(10).per(1000);
Camilo Sanchez
21

I've run into the same issue with various APIs. AWS is famous for throttling as well.

A couple of approaches can be used. You mentioned the async.map() function. Have you tried async.queue()? The queue method lets you set a hard concurrency limit (like 6), and anything over that amount will be placed in the queue.

Another helpful tool is oibackoff. That library will let you back off and retry your request if you get an error back from the server.

It can be useful to wrap the two libraries to make sure both your bases are covered: async.queue to ensure you don't go over the limit, and oibackoff to ensure you get another shot at getting your request in if the server tells you there was an error.
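To illustrate the async.queue side of that combination, here is a minimal hand-rolled concurrency queue in plain JavaScript (a sketch of the idea only; in real code use async.queue itself): at most `concurrency` worker calls run at once, and the rest wait in FIFO order.

```javascript
// Minimal sketch of what async.queue provides: at most `concurrency`
// worker calls are in flight at once; extra tasks wait in FIFO order.
function createQueue(worker, concurrency) {
  const waiting = [];
  let running = 0;

  function next() {
    while (running < concurrency && waiting.length > 0) {
      const job = waiting.shift();
      running++;
      worker(job.task, function (err, result) {
        running--;
        job.done(err, result);
        next(); // a slot freed up: start the next waiting task
      });
    }
  }

  return {
    push: function (task, done) {
      waiting.push({ task: task, done: done });
      next();
    }
  };
}
```

Something like `createQueue(makeApiCall, 6)` would keep at most 6 requests in flight. Note that this bounds concurrency only, not requests per unit of time, which is exactly the caveat raised in the comments on this answer.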

Ibrahim Assal
Dan
  • 1
    I'm gonna dig deep into those two suggestions. The only problem I have is that my `async.maps` are spread out and nested within each other. So I can't just replace them with `async.queue`, because I would still not guarantee that the requests to the API would be 6 at a time. They would be 6 * each `async.queue`. But I think the ball is rolling? – ThomasReggi Nov 28 '13 at 00:49
  • 3
    https://caolan.github.io/async/docs.html#queue won't throttle (per sec/min); it only limits the number of concurrent asynchronous operations. – Manohar Reddy Poreddy Dec 04 '17 at 06:32
10

My solution using modern vanilla JS:

function throttleAsync(fn, wait) {
  let lastRun = 0;

  async function throttled(...args) {
    const currentWait = lastRun + wait - Date.now();
    const shouldRun = currentWait <= 0;

    if (shouldRun) {
      lastRun = Date.now();
      
      return await fn(...args);
    } else {
      return await new Promise(function(resolve) {
        setTimeout(function() {
          resolve(throttled(...args));
        }, currentWait);
      });
    }
  }

  return throttled;
}

// Usage:

const run = console.log.bind(console);
const throttledRun = throttleAsync(run, 1000);

throttledRun(1); // Will execute immediately.
throttledRun(2); // Will be delayed by 1 second.
throttledRun(3); // Will be delayed by 2 seconds.
Sebastian Simon
djanowski
8

In the async module, this requested feature was closed as "won't fix".

There is a solution using the leaky-bucket or token-bucket model; it is implemented in the "limiter" npm module as RateLimiter.

RateLimiter, see example here: https://github.com/caolan/async/issues/1314#issuecomment-263715550

Another way is to use PromiseThrottle; I used this, and a working example is below:

var PromiseThrottle = require('promise-throttle');
let RATE_PER_SECOND = 5; // 5 = 5 per second, 0.5 = 1 per every 2 seconds

var pto = new PromiseThrottle({
    requestsPerSecond: RATE_PER_SECOND, // up to RATE_PER_SECOND requests per second
    promiseImplementation: Promise  // the Promise library you are using
});

let timeStart = Date.now();
var myPromiseFunction = function (arg) {
    return new Promise(function (resolve, reject) {
        console.log("myPromiseFunction: " + arg + ", " + (Date.now() - timeStart) / 1000);
        let response = arg;
        return resolve(response);
    });
};

let NUMBER_OF_REQUESTS = 15;
let promiseArray = [];
for (let i = 1; i <= NUMBER_OF_REQUESTS; i++) {
    promiseArray.push(
            pto
            .add(myPromiseFunction.bind(this, i)) // passing an argument using bind()
            );
}

Promise
        .all(promiseArray)
        .then(function (allResponsesArray) { // [1 .. 15]
            console.log("All results: " + allResponsesArray);
        });

Output:

myPromiseFunction: 1, 0.031
myPromiseFunction: 2, 0.201
myPromiseFunction: 3, 0.401
myPromiseFunction: 4, 0.602
myPromiseFunction: 5, 0.803
myPromiseFunction: 6, 1.003
myPromiseFunction: 7, 1.204
myPromiseFunction: 8, 1.404
myPromiseFunction: 9, 1.605
myPromiseFunction: 10, 1.806
myPromiseFunction: 11, 2.007
myPromiseFunction: 12, 2.208
myPromiseFunction: 13, 2.409
myPromiseFunction: 14, 2.61
myPromiseFunction: 15, 2.811
All results: 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15

We can clearly see the rate in the output, i.e. 5 calls per second.

Manohar Reddy Poreddy
6

The other solutions were not to my taste. Researching further, I found promise-ratelimit, which gives you an API that you can simply await:

var rate = 2000 // in milliseconds
var throttle = require('promise-ratelimit')(rate)

async function queryExampleApi () {
  await throttle()
  var response = await get('https://api.example.com/stuff') // 'get' is any promise-based HTTP client, e.g. got or request-promise
  return response.body.things
}

The above example ensures you make at most one query to api.example.com every 2000 ms. In other words, the very first request does not wait 2000 ms.
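The same awaitable gate can be hand-rolled in a few lines if you'd rather avoid the dependency. This is a sketch, not promise-ratelimit's actual implementation: each throttle() call resolves intervalMs after the previous one was allowed through, and the first resolves immediately.

```javascript
// Minimal awaitable rate gate (a sketch, not promise-ratelimit's internals).
// Each call to throttle() resolves `intervalMs` after the previous one was
// allowed through; the first call resolves immediately.
function makeThrottle(intervalMs) {
  let nextSlot = Date.now();

  return function throttle() {
    const wait = Math.max(0, nextSlot - Date.now());
    nextSlot = Date.now() + wait + intervalMs;
    return new Promise((resolve) => setTimeout(resolve, wait));
  };
}
```

For the usage shown above, `makeThrottle(2000)` should behave the same way as the library call.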

soundly_typed
2

Here's my solution: use a library like request-promise or axios and wrap the call in this promise.

var Promise = require("bluebird")

// http://stackoverflow.com/questions/28459812/way-to-provide-this-to-the-global-scope#28459875
// http://stackoverflow.com/questions/27561158/timed-promise-queue-throttle

module.exports = promiseDebounce

function promiseDebounce(fn, delay, count) {
  var working = 0, queue = [];
  function work() {
    if ((queue.length === 0) || (working === count)) return;
    working++;
    Promise.delay(delay).tap(function () { working--; }).then(work);
    var next = queue.shift();
    // next is [thisArg, args, resolve]
    next[2](fn.apply(next[0], next[1]));
  }
  return function debounced() {
    var args = arguments;
    return new Promise(function (resolve) {
      queue.push([this, args, resolve]);
      if (working < count) work();
    }.bind(this));
  };
}
ThomasReggi
1

I use the async-sema module to throttle HTTP requests, which means it allows you to send HTTP requests with a rate limit.

Here is an example:

A simple Node.js server adds the express-rate-limit middleware to the API so that the API is rate-limited. Let's say this is the Shopify API in your case.

server.ts:

import express from 'express';
import rateLimit from 'express-rate-limit';
import http from 'http';

const port = 3000;
const limiter = rateLimit({
  windowMs: 1000,
  max: 3,
  message: 'Max RPS = 3',
});

async function createServer(): Promise<http.Server> {
  const app = express();

  app.get('/place', limiter, (req, res) => {
    res.end('Query place success.');
  });

  return app.listen(port, () => {
    console.log(`Server is listening on http://localhost:${port}`);
  });
}

if (require.main === module) {
  createServer();
}

export { createServer };

On the client side, we want to send HTTP requests capped at 3 per second. I put the client-side code inside a test case, so don't be surprised by the structure.

server.test.ts:

import { RateLimit } from 'async-sema';
import rp from 'request-promise';
import { expect } from 'chai';
import { createServer } from './server';
import http from 'http';

describe('20253425', () => {
  let server: http.Server;
  beforeEach(async () => {
    server = await createServer();
  });
  afterEach((done) => {
    server.close(done);
  });
  it('should throttle http request per second', async () => {
    const url = 'http://localhost:3000/place';
    const n = 10;
    const lim = RateLimit(3, { timeUnit: 1000 });

    const resArr: string[] = [];
    for (let i = 0; i < n; i++) {
      await lim();
      const res = await rp(url);
      resArr.push(res);
      console.log(`[${new Date().toLocaleTimeString()}] request ${i + 1}, response: ${res}`);
    }

    expect(resArr).to.have.lengthOf(n);
    resArr.forEach((res) => {
      expect(res).to.be.eq('Query place success.');
    });
  });
});

Test results. Pay attention to the timestamps of the requests:

  20253425
Server is listening on http://localhost:3000
[8:08:17 PM] request 1, response: Query place success.
[8:08:17 PM] request 2, response: Query place success.
[8:08:17 PM] request 3, response: Query place success.
[8:08:18 PM] request 4, response: Query place success.
[8:08:18 PM] request 5, response: Query place success.
[8:08:18 PM] request 6, response: Query place success.
[8:08:19 PM] request 7, response: Query place success.
[8:08:19 PM] request 8, response: Query place success.
[8:08:19 PM] request 9, response: Query place success.
[8:08:20 PM] request 10, response: Query place success.
    ✓ should throttle http request per second (3017ms)


  1 passing (3s)
Lin Du
1

So many great options here. Here is the one that I am using in one of my projects:

axios-request-throttle

Usage:

import axios from 'axios';
import axiosThrottle from 'axios-request-throttle';

axiosThrottle.use(axios, { requestsPerSecond: 5 });
0

I was searching for a snippet for throttling async functions and found none that takes the invocation state into account.

If you do not want to allow parallel function invocations, and also do not want the function to be called for some duration after it finishes, here is the code I came up with.

/**
 * Throttles async function. Takes into account the function call duration and waits
 * extra wait milliseconds after the function call is done.
 * If the throttled function is called during the execution or wait state the call
 * arguments are stored and the last ones are used to call the function at the end
 * of the waiting state.
 *
 * Throttling example of function having one number argument and wait = 5000ms:
 * <pre>
 *   Time                       0s         3s            8s             15s   20s
 *   Outside call (argument)    1               2   3  4    5
 *   Inside call (argument)     1 -------> OK            4 -----------> OK    5 ------> OK
 * </pre>
 *
 * @param func function to throttle
 * @param wait waiting duration after the function call is finished
 */
export function throttleAsync<A extends unknown[], R>(func: (...args: A) => Promise<R>, wait: number) {
  // currently invoked function promise
  let promise: Promise<R> | undefined;

  // last call arguments during the function invocation or waiting state
  let lastDeferredArgs: A | undefined;

  function throttled(...args: A) {
    // function is not running and we are not waiting
    if (!promise) {
      // invoke the function
      promise = func(...args).finally(() => {
        // invocation is done, now wait extra 'wait' milliseconds
        window.setTimeout(() => {
          // then set the promise to undefined allowing subsequent invocations
          promise = undefined;
          const deferredArgs = lastDeferredArgs;
          lastDeferredArgs = undefined;

          // there was some deferred invocation - invoke now with the latest arguments
          if (deferredArgs) {
            throttled(...deferredArgs);
          }
        }, wait);
      });
    } else {
      // function is running or we are waiting - store arguments for a deferred invocation
      lastDeferredArgs = args;
    }
  }
    }
  }

  return throttled;
}

Codepen

Petr Újezdský