
I need to make 300+ GET API calls, and I don't want to fire them all at once and strain the server, so I was thinking of running about five asynchronous calls at a time. From what I have read here, operation queues sound awesome and very useful. In the answer to this question, the asynchronous example creates an operation queue and passes it into the call. I'm assuming that if I did something like that, I could make five separate queues and funnel my calls into them (I'm assuming; I haven't actually tried it, since I would like to use Alamofire). Is something similar possible with Alamofire?
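To illustrate the idea I'm describing with plain Foundation (no Alamofire): instead of five separate queues, a single `OperationQueue` with `maxConcurrentOperationCount = 5` gives the same throttling. This is just a sketch; the endpoint URL is a made-up placeholder.

```swift
import Foundation

// A single queue that runs at most 5 operations concurrently
// behaves like "5 queues" for throttling purposes.
let queue = OperationQueue()
queue.maxConcurrentOperationCount = 5

let session = URLSession(configuration: .default)

for id in 1...300 {
    queue.addOperation {
        // Hypothetical endpoint, for illustration only.
        let url = URL(string: "https://api.example.com/items/\(id)")!
        let semaphore = DispatchSemaphore(value: 0)
        let task = session.dataTask(with: url) { data, response, error in
            // Handle the response here...
            semaphore.signal()
        }
        task.resume()
        // Block this operation until its request finishes, so the queue
        // never has more than 5 requests in flight at once.
        semaphore.wait()
    }
}
```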

– boidkan

1 Answer


Network requests are automatically managed by the underlying URL Loading System, so Alamofire should be able to handle whatever you throw at it. There are ways to schedule Alamofire requests on a queue, but it shouldn't be necessary. It's always better to try it and measure the results than to speculate.
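If you do decide you want an explicit cap, the URL Loading System exposes one directly: `URLSessionConfiguration.httpMaximumConnectionsPerHost`. A sketch of wiring that into Alamofire, assuming the modern `Session(configuration:)` initializer (called `Manager` in early Alamofire versions) and a hypothetical endpoint:

```swift
import Foundation
import Alamofire

let configuration = URLSessionConfiguration.default
// Cap simultaneous connections to any single host at 5; additional
// requests wait in the URL Loading System until a slot frees up.
configuration.httpMaximumConnectionsPerHost = 5

let session = Session(configuration: configuration)

for id in 1...300 {
    // Hypothetical endpoint, for illustration only.
    session.request("https://api.example.com/items/\(id)")
        .responseData { response in
            // Handle each response as it completes...
        }
}
```

Note this limits connections per host, not total in-flight requests, which is usually what you want when the concern is straining one server.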

mattt
  • 19,544
  • 7
  • 73
  • 84
  • So you are saying that if I try to make 300 GETs at around the same time, the underlying URL Loading System will make sure it doesn't flood the server? – boidkan Oct 02 '14 at 15:21
  • All the URL Loading System can ensure is that the _client_ won't bite off more than it can chew. As for the server, 300 isn't a particularly high number of concurrent requests to handle. – mattt Oct 02 '14 at 18:42
  • Yes, well, I guess I didn't explain. It's 300+ per device, not 300+ total (and with my current code it's actually closer to 600). So with a lot of devices and a server that can't handle much traffic, this would be a problem in my case. I'm not really worried about the client side. – boidkan Oct 02 '14 at 19:25
  • No, I understood that. If your server is indeed incapable of handling real load, I would recommend that you take the time to refactor things to combine and reduce the number of calls made. – mattt Oct 03 '14 at 17:19