
I need to send requests to a REST API endpoint, but the API imposes a limit: I cannot send more than 10 MB/s.

Every request has a fixed size (let's assume 1 MB for simplicity).

I could create a batch of 10 requests and wait until all of them are finished; if less than a second has passed, I would wait before sending another round of requests.

This would be fine, and I found this question that deals with this problem.

However, I am not limited by the number of calls per second, but rather by the amount of data per second!

This means that if some of the requests are not yet finished, they might still be sending data, so I would have to wait for all the requests to finish before starting another round.

An edge case might be one request that takes a long time (e.g. 5 seconds) while all the others took 0.9 seconds. I could start another round of 9 requests while the 5-second request takes its time to finish!

Unfortunately, all of the solutions I found focus on limiting either data (but for streams) or the number of requests, not both.

How can I ensure that I use at most 10 MB/s when sending HTTP requests, without being blocked by some requests that take longer than others to finish?
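One standard way to limit data volume per second rather than request count is a token bucket: each request must acquire tokens equal to its payload size before it is sent, and tokens refill at the target rate. Once a request has been admitted it holds no tokens, so a slow 5-second request never blocks the others. The question doesn't include code (a comment suggests a .NET context), so here is a minimal illustrative sketch in Python; `TokenBucket` and its parameter names are mine, not from the question.

```python
import threading
import time

class TokenBucket:
    """Rate limiter by data volume, not request count.

    Tokens (measured in MB) refill continuously at rate_mb_per_s up to
    capacity_mb. A request consumes tokens equal to its payload size
    *before* sending, so the total data handed off to the network never
    exceeds the configured rate, no matter how long individual requests
    take to complete.
    """

    def __init__(self, rate_mb_per_s=10.0, capacity_mb=10.0):
        self.rate = rate_mb_per_s
        self.capacity = capacity_mb
        self.tokens = capacity_mb
        self.last_refill = time.monotonic()
        self.lock = threading.Lock()

    def acquire(self, amount_mb):
        """Block until amount_mb tokens are available, then consume them."""
        while True:
            with self.lock:
                now = time.monotonic()
                # Refill proportionally to elapsed time, capped at capacity.
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.last_refill) * self.rate)
                self.last_refill = now
                if self.tokens >= amount_mb:
                    self.tokens -= amount_mb
                    return
                shortfall = amount_mb - self.tokens
            # Sleep just long enough for the missing tokens to refill.
            time.sleep(shortfall / self.rate)
```

Usage: each worker thread calls `bucket.acquire(1.0)` before sending its 1 MB request. In the edge case above, the 5-second request consumed its 1 MB up front, so the next round of requests can acquire tokens and start without waiting for it.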

  • Doesn't look promising https://weblog.west-wind.com/posts/2014/Jan/29/Using-NET-HttpClient-to-capture-partial-Responses – Jeremy Thompson Dec 17 '18 at 03:13
  • The 10MB rate is usually set in the Ethernet Card (not application) which has 10MB and 100MB rates. – jdweng Dec 17 '18 at 05:31
  • @mjwills I think so, it limits access to a critical section by allowing only N actors to enter it at the same time. How did your question help me? – MatthewRock Dec 17 '18 at 18:47
  • I'd suggest using a semaphore to ensure no more than N concurrent requests. Then use something like https://stackoverflow.com/questions/5852863/fixed-size-queue-which-automatically-dequeues-old-values-upon-new-enques to keep track of the start times of the last N requests (i.e. check the oldest request started more than a second ago). Also consider race conditions (where the 11th and 12th request both come in at the same time, and both see the 1st completed time, when really the 11th should see the 1st and the 12th should see the 2nd). – mjwills Dec 18 '18 at 00:26
  • @mjwills Thanks. I was thinking of some similar solution, but I was hoping that maybe there is a simpler (less manual work) way to do it. Thank you for your insight! – MatthewRock Dec 18 '18 at 20:33
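The approach mjwills describes (a semaphore for concurrency plus a fixed-size queue of start times for the per-second rate) can be sketched as follows. This is an illustrative Python sketch under my own assumptions; `throttled_send`, `MAX_CONCURRENT`, and `WINDOW_SECONDS` are hypothetical names, and doing the timestamp check under one lock is one simple way to avoid the 11th-vs-12th-request race mentioned in the comment.

```python
import threading
import time
from collections import deque

MAX_CONCURRENT = 10   # at most N requests in flight at once
WINDOW_SECONDS = 1.0  # at most N request *starts* per window

sem = threading.Semaphore(MAX_CONCURRENT)
# deque(maxlen=N) automatically drops the oldest entry on append,
# which is exactly the fixed-size queue from the linked question.
start_times = deque(maxlen=MAX_CONCURRENT)
times_lock = threading.Lock()

def throttled_send(send_fn):
    """Run send_fn, admitting it only when fewer than MAX_CONCURRENT
    requests are in flight AND fewer than MAX_CONCURRENT were started
    within the last WINDOW_SECONDS."""
    with sem:  # bounds concurrency
        with times_lock:  # serializes the rate check, avoiding the race
            if len(start_times) == start_times.maxlen:
                # The oldest of the last N starts must be > 1 second old.
                wait = WINDOW_SECONDS - (time.monotonic() - start_times[0])
                if wait > 0:
                    time.sleep(wait)
            start_times.append(time.monotonic())
        return send_fn()
```

Note that this bounds request *starts* per second, not bytes per second, so it only matches the data limit when requests have a fixed size (1 MB here); for variable sizes a token bucket keyed on payload size is the better fit.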

0 Answers