
I have a PHP script which queries a list of clients from a MySQL database, and goes to each client's IP address and picks up some information which is then displayed on the webpage.

But it takes a long time if the number of clients is high. Is there any way I can send those URL requests (file_get_contents) in parallel?

roymustang86
  • How can a script 'go' to all these clients? Aren't they in fact the server then? I'm not sure what you're trying to create here, but wouldn't it be better if all clients pushed their status at regular intervals? Then the PHP script already has this data available when you request it. Please elaborate. – GolezTrol Jul 28 '11 at 14:09
  • 1
    Duplicate of http://stackoverflow.com/questions/209774/does-php-have-threading – Candide Jul 28 '11 at 14:10
  • @GolezTrol - Sorry about the wrong wording. It picks up the URL and does a file_get_contents on that URL, waiting for each file_get_contents to respond before moving on. I want to launch all of them in parallel so that I can get all the results back in time. – roymustang86 Jul 28 '11 at 14:17

2 Answers


Lineke Kerckhoffs-Willems wrote a good article about multithreading in PHP with cURL. You can use the curl_multi functions instead of file_get_contents() to fire all the requests at once and collect the results as they come back.
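
A minimal sketch of that approach, assuming $urls holds the client addresses collected from your database (the addresses shown are placeholders):

```php
<?php
// Hypothetical list of client endpoints pulled from MySQL.
$urls = ['http://192.0.2.1/status', 'http://192.0.2.2/status'];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't let one slow client block the rest
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers at once; loop until every handle has finished.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for socket activity instead of busy-looping
    }
} while ($running && $status == CURLM_OK);

// Collect each response and clean up.
$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

The total wall time is then roughly that of the slowest client rather than the sum of all of them.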


If this needs to scale, I would use something like Gearman to assign the requests as jobs in a queue for workers to come along and complete; see the sketch below.
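
For illustration, a rough sketch of the Gearman approach, assuming the PECL gearman extension is installed and a gearmand server is running locally; the fetch_status function name and the $urls array are hypothetical:

```php
<?php
// worker.php - run one or more of these; each fetches one URL per job.
$worker = new GearmanWorker();
$worker->addServer(); // assumes gearmand on localhost:4730
$worker->addFunction('fetch_status', function (GearmanJob $job) {
    return file_get_contents($job->workload()); // the URL is the job payload
});
while ($worker->work());
```

```php
<?php
// client.php - queue one task per client URL; workers run them in parallel.
$client = new GearmanClient();
$client->addServer();

$results = [];
$client->setCompleteCallback(function (GearmanTask $task) use (&$results) {
    $results[$task->unique()] = $task->data(); // key results by the unique id (the URL)
});

foreach ($urls as $url) {
    $client->addTask('fetch_status', $url, null, $url); // use the URL as the unique id
}
$client->runTasks(); // blocks until all workers have finished
```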

As another option, I have also written a PHP wrapper for the Unix at queue, which might be a fit for this problem. It would allow you to schedule the requests so that they run in parallel. I have used this method successfully in the past to handle sending bulk email, which has similar blocking problems to your script.

Treffynnon