Is there any way to still make use of all the feeds, but instead of loading all 25 posts of every feed (`<entry></entry>` or `<item></item>`), only get the first 10 posts of each feed?

$feeds = array('',''); //a looot of inputs

$entries = array();
foreach ($feeds as $feed) {

    $xml = simplexml_load_file($feed);
    $xml->registerXPathNamespace('f', 'http://www.w3.org/2005/Atom');

    $entries = array_merge($entries, $xml->xpath('/f:feed/f:entry | /rss/channel//item'));

}
EnexoOnoma
    @madflow that is `asp` and not `php` – Mob Oct 02 '11 at 19:11
  • @mob You may say that he refers to the XPath. – EnexoOnoma Oct 02 '11 at 19:13
  • @madflow Does it have any purpose if I still load all the feeds first? `$xml = simplexml_load_file($feed);` – EnexoOnoma Oct 02 '11 at 19:13
  • Right, sorry. My point being: this must have been answered already. Still an interesting question though ;) – madflow Oct 02 '11 at 19:14
  • What are you actually trying to speed up? It seems your approach will be retrieving each feed sequentially, rather than in parallel. – salathe Oct 02 '11 at 19:29
  • How about getting the feeds, storing them in an array and then slicing the array? http://php.net/manual/en/function.array-slice.php (a rough sketch of this follows these comments) – Mob Oct 02 '11 at 19:32
  • @Mob I do not know... Isn't it double the work to load the entire feeds into an array and then slice it? Will this help with the output? I updated my question. – EnexoOnoma Oct 02 '11 at 19:41
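
For reference, here is a rough sketch of the approach Mob's comment suggests: collect each feed's entries as before, then keep only the first 10 per feed with `array_slice()`. The `$feeds` array and the XPath expression are taken from the question; error handling is omitted, as in the original.

$entries = array();
foreach ($feeds as $feed) {

    $xml = simplexml_load_file($feed);
    $xml->registerXPathNamespace('f', 'http://www.w3.org/2005/Atom');

    // Grab every entry/item of this feed first...
    $feedEntries = $xml->xpath('/f:feed/f:entry | /rss/channel//item');

    // ...then merge only the first 10 of them.
    $entries = array_merge($entries, array_slice($feedEntries, 0, 10));
}

Note that each feed document is still downloaded and parsed in full; only the merged `$entries` array stays smaller, which is what the follow-up comment above is asking about.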

1 Answer

Try this:

$entries = array();
foreach ($feeds as $feed) {

    $xml = simplexml_load_file($feed);
    $xml->registerXPathNamespace('f', 'http://www.w3.org/2005/Atom');

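    // position() <= 10 keeps only the first 10 entries/items from each feed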
    $entries = array_merge($entries, $xml->xpath('/f:feed/f:entry[position() <= 10] | /rss/channel//item[position() <= 10]') );
}

This uses the XPath `position()` function in a predicate, so each feed contributes at most its first 10 `<entry>`/`<item>` elements to `$entries`.

Decent Dabbler
  • Even with this, the entire documents are still loaded, but the size of the array is reduced, right? Does this speed things up? – EnexoOnoma Oct 02 '11 at 21:42
  • @mtopia: Somewhat, I think, depending on how many feeds you are reading. But if you only want to partially load XML files (feeds), you may want to look into [`XMLReader`](http://php.net/manual/en/book.xmlreader.php). Although I must admit I haven't been able to figure out exactly how it works, I believe it *does* allow you to do what you want: reading an XML stream element by element, so to speak, rather than reading the whole document at once (a rough sketch follows these comments). – Decent Dabbler Oct 02 '11 at 22:19
  • I tried it in another app I was building but the speed of it was awful. – EnexoOnoma Oct 02 '11 at 22:26
  • @mtopia: really? Hmmm... was that application fetching the xml from some other server by any chance (as I suppose your current script is doing as well?)? It's just a guess, but I would think reading a stream partially, element by element, from another server, could indeed cause it to slow down. `XMLReader` is probably more useful on local filesystem streams than on external streams. But to be honest: that's just an educated guess. Don't take my word for it. – Decent Dabbler Oct 03 '11 at 06:46
  • No, it was reading from a huge local file. I read some benchmarks about it, so I tried it. I tested many approaches, but nothing beat SimpleXML. – EnexoOnoma Oct 03 '11 at 13:50
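
For completeness, here is a minimal sketch of the `XMLReader` idea discussed in these comments, applied to a single RSS feed URL in `$feed` (a placeholder; an Atom feed would need the `entry` element instead of `item`). It steps through the stream element by element and stops after the first 10 items rather than parsing the whole document:

$reader = new XMLReader();
if (!$reader->open($feed)) {
    die('Could not open feed: ' . $feed);
}

$doc   = new DOMDocument();
$items = array();

while ($reader->read()) {
    // React only to opening <item> tags
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'item') {
        // Expand just this element into DOM and wrap it in SimpleXML
        $items[] = simplexml_import_dom($doc->importNode($reader->expand(), true));
        if (count($items) >= 10) {
            break; // stop reading the stream after the first 10 items
        }
    }
}
$reader->close();

Whether this is actually faster than `simplexml_load_file()` depends on the feed size and on whether it is fetched over the network; the benchmark mentioned in the last comment suggests it is not a guaranteed win.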