
I'm looking into writing a proxy aggregator for feeds, where a handful of users specify a feed URL and some set of conditions, and the system outputs a continually-updated RSS/Atom feed of entries that match those conditions.

Are there best practices for feed aggregators? (Or filtering feed proxies?)

For example:

  • Are there certain feed elements that should or shouldn't be modified by proxies?
  • How should a feed proxy/parser indicate that it's not passing along a pristine copy of the original feed?
  • Does it make sense to delegate the work of downloading/updating to a third-party aggregator platform, e.g. the Google Feed API? I presume that would save a lot of work compared to handling polling, 301 redirects, etc. myself.
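On the second bullet, one hedged sketch of how a proxy might signal a non-pristine copy in Atom: set the feed-level `<generator>` to identify the proxy and repoint the `rel="self"` link at the proxy's own URL, while leaving entry-level elements (`id`, `updated`, `content`) untouched. The names `mark_as_filtered` and `ExampleFeedFilter` are made up for illustration:

```python
# Hypothetical sketch: mark a filtered Atom feed as a derived copy.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM)  # serialize without an ns prefix

def mark_as_filtered(atom_xml: str, proxy_url: str) -> str:
    feed = ET.fromstring(atom_xml)

    # Replace (or add) the feed-level <generator> to identify the proxy.
    gen = feed.find(f"{{{ATOM}}}generator")
    if gen is None:
        gen = ET.SubElement(feed, f"{{{ATOM}}}generator")
    gen.text = "ExampleFeedFilter"  # hypothetical proxy name
    gen.set("uri", proxy_url)

    # Point rel="self" at the proxy's URL, not the origin's, since
    # this document is no longer the feed the origin served.
    for link in feed.findall(f"{{{ATOM}}}link"):
        if link.get("rel") == "self":
            link.set("href", proxy_url)

    return ET.tostring(feed, encoding="unicode")
```

Entry `id` values in particular should pass through untouched, so downstream readers can de-duplicate against the original feed.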

Thanks for your help.

Dan Lowe
Anirvan

3 Answers


Don't query any feed more often than once every 30 minutes. Use caching.
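A minimal sketch of that rule, assuming a per-feed state record (the `FeedState` class and function names here are invented, not any particular aggregator's API): enforce the 30-minute floor, and send HTTP conditional headers so the origin can answer `304 Not Modified` instead of re-serving the whole feed.

```python
import time

MIN_POLL_SECONDS = 30 * 60  # never hit a feed more often than this

class FeedState:
    """Per-feed bookkeeping kept between polls (hypothetical record)."""
    def __init__(self):
        self.last_fetch = 0.0      # epoch seconds of the last request
        self.etag = None           # ETag from the last 200 response
        self.last_modified = None  # Last-Modified from the last 200

def should_fetch(state: FeedState, now: float) -> bool:
    """Only fetch if at least MIN_POLL_SECONDS have elapsed."""
    return now - state.last_fetch >= MIN_POLL_SECONDS

def conditional_headers(state: FeedState) -> dict:
    """Build If-None-Match / If-Modified-Since headers so the server
    can reply 304 and we skip re-downloading an unchanged feed."""
    headers = {}
    if state.etag:
        headers["If-None-Match"] = state.etag
    if state.last_modified:
        headers["If-Modified-Since"] = state.last_modified
    return headers
```

On a `200` response you would store the new `ETag`/`Last-Modified` values back into the state; on a `304` you only update `last_fetch`.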

-Adam

Adam Davis

Don't get bought by Ask.com

Sam Hasler

You could also use Yahoo Pipes, I guess... Or this one: planetplanet.org