I have the following function in PHP that reads URLs of pages from an array and fetches the HTML content of each page for parsing. The code below works fine:
public function fetchContent($HyperLinks){
    foreach ($HyperLinks as $link) {
        $content = file_get_html($link);
        foreach ($content->find('blablabla') as $result) {
            $this->HyperLink[] = $result->xmltext;
        }
    }
    return $this->HyperLink;
}
The problem with this code is that it is very slow: it takes about 1 second to fetch and parse each page. Given the very large number of pages to read, I am looking for a parallel version of the above code. The content of each page is only a few kilobytes.
I searched and found the exec command, but cannot figure out how to use it for this. I want a function that I can call N times in parallel so the whole run takes less time. The function would take one link as input, like below:
public function FetchContent($HyperLink){
    // reading and parsing code
}
I tried this exec call:
print_r(exec("FetchContent",$HyperLink ,$this->Title[]));
but it does not work. I also replaced "FetchContent" with "FetchContent($HyperLink)" and removed the second parameter, but neither works.
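From the documentation my understanding is that exec() runs a shell command rather than calling a PHP method directly, so I assume the fetching code would have to live in a standalone script that is launched once per link. This is only a sketch of what I have in mind; fetch_one.php is a hypothetical file name and 'blablabla' is the same placeholder selector as above:

// fetch_one.php -- hypothetical standalone worker, takes one URL as its argument
include 'simple_html_dom.php';           // assuming file_get_html() comes from this library

$link = $argv[1];
$content = file_get_html($link);
foreach ($content->find('blablabla') as $result) {
    echo $result->xmltext . PHP_EOL;     // print matches so the caller can collect them
}

and then, from the main script, something like:

// start one background process per link, redirecting each one's output to its own file
foreach ($HyperLinks as $link) {
    exec("php fetch_one.php " . escapeshellarg($link) . " > result_" . md5($link) . ".txt &");
}

I am not sure whether this is the right way to do it, which is why I am asking.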
Thanks. Please let me know if anything is missing. Any approach that helps me quickly process the content of numerous pages (at least 200-500) is welcome; for instance, I am wondering whether something along the lines of the curl_multi sketch below would be a reasonable direction.
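This is only a rough sketch, assuming the pages can be downloaded with cURL and then parsed with str_get_html() from the same simple_html_dom library that provides file_get_html(); 'blablabla' is again the placeholder selector:

include 'simple_html_dom.php';

function fetchContentParallel($HyperLinks)
{
    $multi = curl_multi_init();
    $handles = array();

    // queue one cURL handle per link
    foreach ($HyperLinks as $link) {
        $ch = curl_init($link);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($multi, $ch);
        $handles[$link] = $ch;
    }

    // run all transfers at the same time
    $running = null;
    do {
        curl_multi_exec($multi, $running);
        curl_multi_select($multi);
    } while ($running > 0);

    // collect and parse each response
    $results = array();
    foreach ($handles as $link => $ch) {
        $html = str_get_html(curl_multi_getcontent($ch));
        foreach ($html->find('blablabla') as $result) {
            $results[] = $result->xmltext;
        }
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

    return $results;
}

Would something like this scale to 200-500 pages, or is there a better way?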