
I have a feed that is password protected. Below is the code I use to access the feed:

$url = 'http://thefeedwebsite.com/feed.php';

$data = array("username" => "user", "password" => "password", "location" => "HK");

$ch = curl_init($url);

curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response as a string instead of printing it

$output = curl_exec($ch);

curl_close($ch);

The problem is that, due to the large size of the feed, it keeps timing out after outputting about 100 results. I have raised the time limit in my php.ini as some threads suggested, but the issue persists. I think it's because cURL loads the complete feed into memory.

Is it possible to load the $output directly into XMLReader() in PHP so I can process the feed through the reader faster?

Sorry if the question is totally noob. I just started learning PHP with XML.


2 Answers


This thread could help you (streaming cURL and playing with memory):

Manipulate a string that is 30 million characters long

The first answer there stores the data in a file. The second one streams the data as it arrives. If the file is really huge, you should also consider which XML parser you want to use. Some load the whole XML into memory and build an object from it, while others provide interface methods that let you work with the XML on the fly (without loading the whole document into memory).
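
For example, the file-based approach could look roughly like this. This is an untested sketch: the URL and POST fields are copied from your question, and the <item> element name is an assumption; adjust both to match your actual feed.

$url  = 'http://thefeedwebsite.com/feed.php';
$data = array("username" => "user", "password" => "password", "location" => "HK");

// Let cURL stream the response body into a temporary file so the
// whole feed never has to fit in memory at once.
$file = tempnam(sys_get_temp_dir(), 'feed');
$fp   = fopen($file, 'w');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_FILE, $fp); // write the body to the file, not to a string
curl_exec($ch);
curl_close($ch);
fclose($fp);

// XMLReader is a pull parser: it walks the document node by node
// instead of building the whole tree, so memory use stays flat.
$reader = new XMLReader();
$reader->open($file);

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        $entry = $reader->readOuterXml(); // just this one entry's markup
        // ... process one entry at a time here ...
    }
}

$reader->close();
unlink($file);

If readOuterXml() is not available on your build, $reader->expand() gives you a DOMNode for the current element instead.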


If the time limit (http://php.net/manual/en/function.set-time-limit.php) is not your issue, have you considered that you could be running out of memory?

http://www.php.net/manual/en/ini.core.php#ini.memory-limit
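
To check, you could log the peak usage against the configured limit. A minimal sketch (the 256M value is only an example; pick whatever fits your server):

echo 'memory_limit: ' . ini_get('memory_limit') . "\n";

// ... download and process the feed here ...

// true = report the memory actually allocated from the system
echo 'peak usage: ' . round(memory_get_peak_usage(true) / 1048576, 1) . " MB\n";

// If the peak is hitting the limit, raise it for this script only:
// ini_set('memory_limit', '256M');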
