
Possible Duplicate:
Asynchronous HTTP requests in PHP

I have a script that uses a foreach loop to take item ids from a file and issue a HEAD request for each, checking whether the status is 200. The script will take a long time to run, so is there any way to make multiple requests at once? I know that with Ajax, a for loop around $.ajax() starts them all at once. How can I get that behavior with PHP, or is there an alternative that allows batch processing?

Steve

3 Answers


Use the curl_multi_* functions. See http://www.php.net/manual/en/function.curl-multi-exec.php
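A minimal sketch of that approach: all handles are registered on one multi handle, the transfers run concurrently, and the status codes are collected at the end. The function name `check_urls` and its return shape are illustrative, not from the answer.

```php
<?php
// Check a batch of URLs in parallel using curl_multi_* with HEAD requests.
function check_urls(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: skip the body
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't print anything
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $codes = [];
    foreach ($handles as $url => $ch) {
        $codes[$url] = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $codes; // url => HTTP status code (0 if the connection failed)
}
```

You'd then filter the returned array for entries equal to 200. For thousands of ids, feed the URLs in batches of a few dozen rather than registering them all at once.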

Josef Kufner

A single request is easy (as you note). Just loop through and use the PEAR library HTTP_Request2 to make a HEAD request:

  require_once 'HTTP/Request2.php';

  $request = new HTTP_Request2('http://www.your_url.com/',
                         HTTP_Request2::METHOD_HEAD);
  $response = $request->send();
  if (200 == $response->getStatus()) {
    echo "Yippie... a 200";
  }

You can also use the PHP curl extension, but it's fiddlier to deal with: Header only retrieval in php via curl
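For reference, the curl equivalent of a single HEAD request looks like this; `CURLOPT_NOBODY` is what turns the request into a HEAD, and `head_status` is just an illustrative helper name.

```php
<?php
// Fetch only the HTTP status code of a URL via a curl HEAD request.
function head_status(string $url): int
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // send HEAD, never download the body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // keep curl_exec() from printing
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code; // 0 if the request never completed
}
```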

However, to do it in parallel solely in PHP you'd need to use something like exec() to fork off other PHP scripts, since PHP does not natively support threads or concurrency. Similarly, you could use a custom job/worker queue built on Gearman or something like it, but that's pretty involved.
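The exec() route can be sketched like this. Everything here is an assumption for illustration: `check_one.php` is a hypothetical worker script that checks one item id, and `ids.txt` stands in for the question's input file. The trailing `&` backgrounds each worker so exec() returns immediately (Unix shell syntax).

```php
<?php
// Build the shell command for one worker; escapeshellarg() guards against
// shell injection from untrusted ids in the input file.
function build_worker_cmd(string $id): string
{
    return 'php check_one.php ' . escapeshellarg($id) . ' > /dev/null 2>&1 &';
}

// Read ids (one per line); fall back to an empty list if the file is missing.
$ids = @file('ids.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) ?: [];

foreach ($ids as $id) {
    exec(build_worker_cmd($id)); // '&' detaches the worker, so this doesn't block
}
```

Note there is no built-in limit here: for a large ids file you'd want to cap the number of concurrent workers, which is exactly the bookkeeping that makes this approach more involved than curl_multi.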

Ray

Be careful! If you know it takes a lot of time, you're probably requesting hundreds or thousands of URLs. Be sure not to overwhelm the target server.

Also note that, depending on the code running on the target, a HEAD request may require exactly the same amount of work on the server as a GET request; the only saving is that the response body is not transferred to you. A large burst of such requests might be mistaken for a denial-of-service attack.

Sven