I use an external API to gather data for an object. The way the external API is structured, I must make 5 separate calls to different parts of the API in order to get all the data I need.
Up to now I've been doing this sequentially; however, each iteration takes about 5 seconds, making for a long wait when multiple objects are being handled.
What I'd like to do is fork the 5 tasks out to separate processes, let them each contact the API, and then update an array with their part of the dataset.
I'm looking at the pcntl_* functions to do this; however, I'm not sure what the best practice is for doing it efficiently.
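From skimming the docs, the basic pcntl_fork() pattern seems to be that the call returns twice: 0 in the child, and the child's PID in the parent, which can then reap it with pcntl_waitpid():

    $pid = pcntl_fork();
    if ( $pid === -1 ) {
        die( 'fork failed' );
    } elseif ( $pid === 0 ) {
        // child: do the work here, then exit
        exit( 0 );
    } else {
        // parent: $pid is the child's PID
        pcntl_waitpid( $pid, $status );  // block until this child exits
    }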
Basically, after validating that the object's data exists in the remote system, I wish to call five functions in parallel: func1 through func5, each of which is passed the object's ID and, I suppose, some way of indicating which of the 5 operations the process needs to perform.
The processes are dispatched, and the parent waits until all five have returned and added their results to the global array.
At that point, the parent sends the array back to the requester and dies. In pseudocode, I'm thinking the function that would handle the forking would work something like this:
function LoadObjectData( $oid ) {
    $data = array();
    // fork our individual operations on the object
    forkit( $oid, FUNC_ONE, &$data );
    forkit( $oid, FUNC_TWO, &$data );
    forkit( $oid, FUNC_THREE, &$data );
    forkit( $oid, FUNC_FOUR, &$data );
    forkit( $oid, FUNC_FIVE, &$data );
    waitForForksToFinish();
    return $data;
}
The second argument would be some sort of indicator of which function the child needs to run, which I'm thinking would just map onto 5 separate functions within the file.
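One thing I do realize is that a forked child only gets a copy of the parent's memory, so passing &$data as above presumably won't propagate anything back to the parent. Here is my rough attempt at fleshing this out, with a hypothetical callApiPart() dispatcher standing in for the five worker functions (the fetchPart* names are placeholders); each child sends its result back over a socket pair, and the &$data reference has moved from forkit() to waitForForksToFinish(), since a reference can't cross the fork boundary:

    define( 'FUNC_ONE',   1 );
    define( 'FUNC_TWO',   2 );
    define( 'FUNC_THREE', 3 );
    define( 'FUNC_FOUR',  4 );
    define( 'FUNC_FIVE',  5 );

    $children = array();  // pid => parent's end of that child's socket pair

    // Hypothetical dispatcher: maps the indicator onto the five workers.
    function callApiPart( $oid, $op ) {
        $dispatch = array(
            FUNC_ONE   => 'fetchPartOne',
            FUNC_TWO   => 'fetchPartTwo',
            FUNC_THREE => 'fetchPartThree',
            FUNC_FOUR  => 'fetchPartFour',
            FUNC_FIVE  => 'fetchPartFive',
        );
        return $dispatch[$op]( $oid );  // call the selected worker function
    }

    function forkit( $oid, $op ) {
        global $children;

        // Socket pair for passing the child's result back; a forked child
        // cannot write into the parent's array directly.
        $pair = stream_socket_pair( STREAM_PF_UNIX, STREAM_SOCK_STREAM, STREAM_IPPROTO_IP );

        $pid = pcntl_fork();
        if ( $pid === -1 ) {
            die( 'fork failed' );
        }

        if ( $pid === 0 ) {
            // child: run one API operation, serialize the result back, exit
            fclose( $pair[1] );
            $result = callApiPart( $oid, $op );
            fwrite( $pair[0], serialize( array( $op => $result ) ) );
            fclose( $pair[0] );
            exit( 0 );
        }

        // parent: keep the other end so the result can be collected later
        fclose( $pair[0] );
        $children[$pid] = $pair[1];
    }

    function waitForForksToFinish( &$data ) {
        global $children;

        foreach ( $children as $pid => $sock ) {
            $payload = stream_get_contents( $sock );  // blocks until the child closes its end
            fclose( $sock );
            pcntl_waitpid( $pid, $status );           // reap the finished child
            $data += unserialize( $payload );         // merge the child's piece into the array
        }
        $children = array();
    }

With that, LoadObjectData() would call forkit( $oid, FUNC_ONE ) through forkit( $oid, FUNC_FIVE ) and then waitForForksToFinish( $data ). I don't know whether a socket pair is the right IPC choice here, though, or whether I'm missing something simpler.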
Is pcntl_* the best way to go with this? Would this be the best structure for the code? Which actual pcntl_* calls would be required?
Thanks.