
How can I make, say, N URL calls in parallel, and process the responses as they come back?

I want to read the responses and print them to the screen, possibly after some manipulation. I don't care about the order of the responses.

  • here's [a solution using `multiprocessing.dummy`](http://stackoverflow.com/a/14594205/4279), another [example](http://stackoverflow.com/a/16750675/4279) – jfs Jun 16 '13 at 16:04
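For reference, here is a minimal sketch of the `multiprocessing.dummy` approach those links describe: a thread pool whose `imap_unordered` yields each result as soon as it finishes, regardless of request order. The URL list and the pool size of 10 are illustrative assumptions, not from the linked answers:

```python
from multiprocessing.dummy import Pool  # thread pool with the multiprocessing.Pool API
from urllib.request import urlopen

urls = ["http://example.com", "http://example.org", "http://example.net"]  # placeholders

def fetch(url):
    # Return the URL with its body so results stay identifiable
    # even though they arrive out of order.
    return url, urlopen(url).read()

pool = Pool(10)  # 10 worker threads (assumed size)
for url, body in pool.imap_unordered(fetch, urls):
    print(url, len(body))  # printed in completion order
pool.close()
pool.join()
```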

2 Answers


You can use Twisted for this; see the client example here: https://twistedmatrix.com/documents/13.0.0/web/howto/client.html#auto3

Twisted is an asynchronous programming library for Python which lets you carry out multiple actions "at the same time," and it comes with an HTTP client (and server).
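As a minimal sketch of that idea, using Twisted's `Agent` and `readBody` (both in `twisted.web.client` as of Twisted 13.0) — each request's callbacks fire as soon as that response is ready, so processing happens in completion order. The URL list is a placeholder:

```python
from twisted.internet import reactor
from twisted.internet.defer import DeferredList
from twisted.web.client import Agent, readBody

urls = [b"http://example.com/", b"http://example.org/"]  # placeholders

agent = Agent(reactor)

def fetch(url):
    # Issue the GET; callbacks run when *this* response arrives,
    # independently of the other requests.
    d = agent.request(b"GET", url)
    d.addCallback(readBody)
    d.addCallback(lambda body: print(url, len(body)))
    return d

# Stop the reactor once every request has succeeded or failed.
dl = DeferredList([fetch(u) for u in urls], consumeErrors=True)
dl.addBoth(lambda _: reactor.stop())
reactor.run()
```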


One basic solution that comes to mind is to use threading.

Depending on the number of URLs you retrieve in parallel, you could have one thread per URL. Or, to scale better, have a fixed number of "worker" threads reading URLs from a shared Queue, as sketched below.
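A minimal sketch of that fixed-worker-pool pattern, using the standard-library `threading` and `queue` modules. The queue holds URLs; each worker pulls one, fetches it, and prints the result as soon as it is ready. The URL list and the worker count of 4 are assumptions for illustration:

```python
import threading
from queue import Queue
from urllib.request import urlopen

urls = ["http://example.com", "http://example.org", "http://example.net"]  # placeholders
q = Queue()

def worker():
    while True:
        url = q.get()
        try:
            body = urlopen(url).read()
            print(url, len(body))  # responses appear in completion order
        finally:
            q.task_done()

for _ in range(4):  # fixed number of worker threads (assumed)
    threading.Thread(target=worker, daemon=True).start()

for url in urls:
    q.put(url)

q.join()  # block until every URL has been processed
```

The daemon threads exit with the main program, so no explicit shutdown signal is needed once `q.join()` returns.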
