I'm trying to fetch statistical data from a web service. Each request takes 1-2 seconds to respond, and I have to submit a request for thousands of IDs, one at a time. Done sequentially, all the requests would add up to a few hours because of the server's response time.

I want to parallelize as many requests as possible (the server can handle it). I've installed PHP 7 and pthreads (CLI only), but the maximum number of threads is limited (20 in the Windows PHP CLI), so I'd have to start multiple processes.

Is there any simple PHP-based framework/library for multi-process/pthreads and job-queue handling? I don't need a large framework like Symfony or Laravel.

1 Answer


Workers

You could look into using php-resque, which doesn't require pthreads.

You will have to run a Redis server, though (local or remote). I believe you can run Redis on Windows, according to this SO answer.
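As a rough sketch of how the fetching could be split into queued jobs (assuming chrisboulton/php-resque installed via Composer and Redis on localhost; the StatsFetchJob class and fetchStats() helper are placeholder names for your own code):

require 'vendor/autoload.php';

Resque::setBackend('localhost:6379');

// Producer script: enqueue one job per ID
$ids = [/* ...thousands of IDs... */];
foreach ($ids as $id) {
    Resque::enqueue('stats', 'StatsFetchJob', ['id' => $id]);
}

// Job class executed by the workers
class StatsFetchJob
{
    public function perform()
    {
        $id   = $this->args['id'];
        $data = fetchStats($id);   // your existing single-request code
        // ...persist $data (file, DB, Redis, ...)...
    }
}

The workers themselves run as separate PHP CLI processes started with the runner that ships with php-resque (something like QUEUE=stats COUNT=20 php resque.php), which gives you the multi-process fan-out without touching pthreads.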


Concurrent Requests

You may also want to look into sending concurrent requests using something like GuzzleHttp; you can find examples of how to use it here.

From the Docs:

You can send multiple requests concurrently using promises and asynchronous requests.

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Initiate each request but do not block
$promises = [
    'image' => $client->getAsync('/image'),
    'png'   => $client->getAsync('/image/png'),
    'jpeg'  => $client->getAsync('/image/jpeg'),
    'webp'  => $client->getAsync('/image/webp')
];

// Wait on all of the requests to complete. Throws a ConnectException
// if any of the requests fail.
$results = Promise\unwrap($promises);

// Or wait for the requests to complete, even if some of them fail.
// settle() gives each entry as an array with 'state' and 'value'/'reason' keys.
$settled = Promise\settle($promises)->wait();

// You can access each result using the key provided to the unwrap
// function.
echo $results['image']->getHeaderLine('Content-Length');
echo $results['png']->getHeaderLine('Content-Length');
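For thousands of IDs you probably don't want to hold one giant promise array in memory; Guzzle also ships a Pool that caps how many requests are in flight at once. A minimal sketch, assuming the service exposes an endpoint like /stats/{id} (the base URI, path, $ids array and callback bodies are placeholders for your own code):

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client(['base_uri' => 'http://example.com/']);
$ids    = [/* ...thousands of IDs... */];

// Build the requests lazily, one GET per ID
$requests = function (array $ids) {
    foreach ($ids as $id) {
        yield new Request('GET', '/stats/' . $id);
    }
};

$pool = new Pool($client, $requests($ids), [
    'concurrency' => 20,  // requests in flight at the same time
    'fulfilled'   => function ($response, $index) use ($ids) {
        // store the result for $ids[$index]
    },
    'rejected'    => function ($reason, $index) use ($ids) {
        // log or retry $ids[$index]
    },
]);

// Blocks until every request has completed or failed
$pool->promise()->wait();

Because the generator yields requests on demand, only about 20 requests exist at any moment, and results are handled in the fulfilled callback as they come in.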