I'm using cURL
to scrape two websites, both from the same PHP
script (which is run every 30 minutes by a cron job). The requests are very simple:
//website 1
$ch = curl_init();
$url = 'url';
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
//website 2
$ch2 = curl_init();
$url2 = 'url';
curl_setopt($ch2, CURLOPT_URL, $url2);
curl_setopt($ch2, CURLOPT_RETURNTRANSFER, true);
$result2 = curl_exec($ch2);
curl_close($ch2);
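For the speed side, I was wondering whether running both requests concurrently with curl_multi would help, instead of doing them one after the other. Here's a rough sketch of what I mean (the URLs are just placeholders, and I'm not sure this is the idiomatic way to drive the multi handle):

```php
<?php
// Placeholder URLs — swap in the real ones.
$urls = ['https://example.com/page1', 'https://example.com/page2'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run both transfers concurrently.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

$results = [];
foreach ($handles as $ch) {
    $results[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

With only two URLs this would roughly halve the wall-clock time (the slower request dominates instead of the sum), but I don't know if it's overkill here.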
My questions are: what is the best practice in cases like this to avoid running out of memory (it hasn't happened yet, but who knows), and to maximize execution speed?
Is there a way to free memory after each cURL
request?
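On the memory side, all I've come up with so far is unsetting the result strings once I'm done with them, and maybe streaming large responses straight to a file instead of holding them in a PHP string. Something like this (the file path is just a placeholder):

```php
<?php
// After processing $result:
unset($result);   // release the response string
// curl_close($ch) also frees the handle's internal buffers.

// For very large responses, CURLOPT_FILE writes the body directly
// to a stream, so it never lives in memory as one big string:
$fp = fopen('/tmp/page.html', 'w');
$ch = curl_init('https://example.com/page1');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
```

Is that the right approach, or is there something better?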
Thank you! :D