I'm using a simple PHP script on the Altervista hosting provider to extract data from a very large HTML table (more than 6300 rows) at this link.
The problem is that I hit "Maximum execution time of 30 seconds exceeded" during the row loop.
I'd like to end up with XML data or even plain CSV text. Is there a faster way than looping over each row?
<?php
set_time_limit(3000);
ini_set('max_execution_time', 3000);

function XML_Append($XML, $Q, $Sex, $TabCnt, $TabName) {
    $pagecontent = file_get_contents($Q);
    echo "DONE fetch";

    $doc = new DOMDocument();
    $doc->preserveWhiteSpace = false;
    libxml_use_internal_errors(true);   // the page is not valid XHTML; silence parser warnings
    $doc->loadHTML($pagecontent);
    libxml_clear_errors();

    $tables = $doc->getElementsByTagName('table');
    $rows   = $tables->item($TabCnt)->getElementsByTagName('tr');
    $rowLen = $rows->length;
    echo $rowLen;

    for ($ir = 0; $ir < $rowLen; ++$ir) {
        echo $ir . "\r\n";
        $row = $rows->item($ir);   // item() instead of $rows[$ir], which needs PHP >= 5.6.3
    }
    unset($doc);
}
$QUERY_SPRINT_FEMMINILE = "http://risultati.fitri.it/rank.asp?Anno=%ANNO%&TRank=S&Ss=F&PunDal=0.00&PunAl=999.99";
$QUERY_SPRINT_MASCHILE  = "http://risultati.fitri.it/rank.asp?Anno=%ANNO%&TRank=S&Ss=M&PunDal=0.00&PunAl=999.99";

$QUERY = "";
$ANNO  = "";
if (isset($_GET['Anno'])) {
    $ANNO = $_GET['Anno'];
} else {
    $ANNO = "2019";
}
$QUERY = str_replace("%ANNO%", $ANNO, $QUERY_SPRINT_MASCHILE);

$xml = new SimpleXMLElement('<DocumentElement/>');
XML_Append($xml, $QUERY, "M", 1, "SP");
echo "DONE";
?>
The row loop (the part that times out) is:
foreach ($rows as $row) {
    $xmlTable = $XML->addChild($TabName);
    $xmlTable->addChild('_S', $Sex);

    $cols   = $row->getElementsByTagName('td');
    $colLen = $cols->length;
    for ($i = 0; $i < $colLen; ++$i) {
        $NomeColonna = "C" . $i;
        $value = $cols->item($i)->nodeValue;
        $value = trim(str_replace(PHP_EOL, "", $value));
        $value = str_replace("\xc2\xa0", "", $value);   // strip non-breaking spaces
        $xmlTable->addChild($NomeColonna, $value);
    }
}
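Since plain CSV would be fine as output anyway, here is a sketch of a leaner variant of the same walk that I'm considering: it skips building the SimpleXML tree entirely and streams CSV rows as they are parsed. This is untested against the live page; the XPath index `(//table)[2]` is an assumption that mirrors `$tables->item(1)` above, and `$QUERY` is the URL built earlier.

```php
<?php
// Sketch: same DOM traversal, but emit CSV directly instead of calling
// addChild() thousands of times and holding the whole XML tree in memory.
libxml_use_internal_errors(true);   // the page is not valid XHTML

$doc = new DOMDocument();
$doc->preserveWhiteSpace = false;
$doc->loadHTML(file_get_contents($QUERY));   // $QUERY as built above

$xpath = new DOMXPath($doc);
$out   = fopen('php://output', 'w');

// (//table)[2] corresponds to $tables->item(1) above (XPath positions are 1-based).
foreach ($xpath->query('(//table)[2]//tr') as $row) {
    $fields = array();
    foreach ($row->getElementsByTagName('td') as $cell) {
        // Same cleanup as the XML loop: trim, then drop non-breaking spaces.
        $fields[] = str_replace("\xc2\xa0", "", trim($cell->textContent));
    }
    if ($fields) {
        fputcsv($out, $fields);   // let PHP handle quoting/escaping
    }
}
fclose($out);
?>
```

The idea is that the per-row cost of `SimpleXMLElement::addChild()` goes away and nothing accumulates in memory, but I haven't measured whether the DOM parse itself is the real bottleneck.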