The following code runs inside a loop; each iteration changes the URL to a new address. My problem is that every pass consumes more and more memory.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://site.ru/');
curl_setopt($ch, CURLOPT_TIMEOUT, 60);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_REFERER, 'http://site.ru/'); // CURLOPT_AUTOREFERER expects a bool, not a URL
curl_setopt($ch, CURLOPT_HEADER, false);
$html = new \DOMDocument();
$html->loadHTML(curl_exec($ch));
curl_close($ch);
$ch = null;
$xpath = new \DOMXPath($html);
$html = null;
foreach ($xpath->query('//*[@id="tree"]/li[position() > 5]') as $category) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $xpath->query('./a', $category)->item(0)->nodeValue);
    curl_setopt($ch, CURLOPT_TIMEOUT, 60);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_REFERER, 'http://site.ru/'); // see note above about CURLOPT_AUTOREFERER
    curl_setopt($ch, CURLOPT_HEADER, false);
    $html = new \DOMDocument();
    $html->loadHTML(curl_exec($ch));
    curl_close($ch);
    $ch = null;
    // etc.
}
Memory usage reaches 2000 MB. Script execution time is ~2 hours. PHP version is 5.4.4. How can I avoid the memory leak? Thanks!
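One variant I am considering is to reuse a single cURL handle for all requests and to free each per-category DOMDocument before the next pass. This is only a sketch, not my production code: `fetch_html` is a hypothetical helper, and the `libxml_clear_errors()` / `gc_collect_cycles()` calls are my assumptions about where memory might be retained:

```php
<?php
// Sketch: one reusable cURL handle, explicit cleanup per iteration.
function fetch_html($ch, $url) {
    curl_setopt($ch, CURLOPT_URL, $url);
    return curl_exec($ch);
}

$ch = curl_init();
curl_setopt($ch, CURLOPT_TIMEOUT, 60);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_REFERER, 'http://site.ru/');
curl_setopt($ch, CURLOPT_HEADER, false);

libxml_use_internal_errors(true); // keep malformed-HTML warnings out of output

$html = new \DOMDocument();
$html->loadHTML(fetch_html($ch, 'http://site.ru/'));
$xpath = new \DOMXPath($html);

foreach ($xpath->query('//*[@id="tree"]/li[position() > 5]') as $category) {
    $url = $xpath->query('./a', $category)->item(0)->nodeValue;

    $page = new \DOMDocument();
    $page->loadHTML(fetch_html($ch, $url));
    // ... process $page ...

    unset($page);          // drop the DOM tree before the next pass
    libxml_clear_errors(); // the internal error buffer grows otherwise
    gc_collect_cycles();   // collect reference cycles (available since PHP 5.3)
}

curl_close($ch);
```

Would this kind of cleanup be enough, or does DOMDocument hold memory that `unset()` cannot release?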