
Thank you for taking a look at this.

I am new to Rails, unfortunately. I currently have to implement an endpoint that Superfeedr can push updates to, but that endpoint has to live in a Rails controller.

Initially it seemed to me that this should run as a background job, and most of what I have read agrees, but I am being pressured to implement it as a Rails controller, which confuses me. I am not certain how to include EventMachine in a request/response cycle.

I know the web is full of examples, but none really answer my question of how to route this. I have no idea.

I have a Rails controller called Superfeeds. I want Superfeedr to push updates to something like myrailsapp/superfeeds/.
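
For reference, this is roughly what I am picturing for the route and the receiving action. I am not at all sure this is the right shape, and everything here is just my guess at the setup:

```ruby
# config/routes.rb -- rough guess at wiring myrailsapp/superfeeds/ to the controller
post '/superfeeds' => 'superfeeds#create'

# app/controllers/superfeeds_controller.rb
class SuperfeedsController < ApplicationController
  # Superfeedr would POST notifications here; an external service won't have a
  # CSRF token, so presumably the check has to be skipped for this action
  skip_before_filter :verify_authenticity_token

  def create
    payload = request.raw_post # the pushed feed content
    # ...inspect payload and hand the relevant parts along somewhere...
    head :ok
  end
end
```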

Inside feeds I want to inspect the pushed content, and then write the results of that to another controller that actually has a model and will persist it.

Basically, the controller called Feeds just needs to receive the information and pass it along. This confuses me, however, because it seems to mean implementing a long-running process inside a Rails controller, and I am not even sure that can work.

Does anyone know of a way this has been done with Rails without running EventMachine as a background job? In the end I really just need to know whether this is possible.

-L

LRH
  • The Superfeedr-ruby gem uses the Superfeedr XMPP API. If you're building a web app (since you're using Rails, I assume this is the case), you should really try to use the PubSubHubbub API, which works like a charm with Rails, as it's a REST API. – Julien Genestoux Nov 22 '11 at 08:19

1 Answer


> Inside feeds I want to inspect the pushed content, and then write the results of that to another controller that actually has a model and will persist it.

Why not do all the work in the one controller? If you're trying to separate out different concerns, you could even use two models - for instance one to do the inspecting/parsing and one to handle the persisting. But rarely would you need or want to pass data from controller to controller.
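
Very roughly, something like this is what I mean. The names (FeedInspector, FeedEntry) are made up, and the parsing is left as a stub since it depends on what Superfeedr actually sends you:

```ruby
# app/models/feed_inspector.rb -- a plain Ruby object, no table behind it
class FeedInspector
  def initialize(raw_payload)
    @raw_payload = raw_payload
  end

  # Return an array of attribute hashes for whatever counts as "relevant";
  # the actual parsing depends on the format Superfeedr pushes to you
  def relevant_entries
    []
  end
end

# app/models/feed_entry.rb -- the ActiveRecord model that persists the results
class FeedEntry < ActiveRecord::Base
end

# app/controllers/superfeeds_controller.rb
class SuperfeedsController < ApplicationController
  def create
    inspector = FeedInspector.new(request.raw_post)
    inspector.relevant_entries.each { |attrs| FeedEntry.create!(attrs) }
    head :ok
  end
end
```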

> I am not certain how to include EventMachine in a request/response cycle.

Never used Superfeedr myself, but I glanced at the docs quickly - are you using the XMPP or the PubSubHubbub client? I assume the latter? If so, you want to do the persistence (and any other time-consuming work) async, i.e. outside the request/response cycle, right?

If you are using an EventMachine-based webserver such as Thin, basically every request cycle is run within an EM reactor, so you can make use of EM's facilities for offloading tasks, such as its deferred thread pool (EM.defer). For an example of this in action, check out Enigmamachine, in particular here. (I believe that in addition your db client library needs to be asynchronous.)
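
In controller terms, the idea is roughly this. This is only a sketch, not Enigmamachine's actual code; FeedEntry is the made-up model from above, and you would still need to make sure ActiveRecord connections are handled sanely across threads:

```ruby
class SuperfeedsController < ApplicationController
  def create
    payload = request.raw_post

    work     = proc { FeedEntry.create!(:payload => payload) } # runs on EM's thread pool
    callback = proc { |entry| Rails.logger.info("Stored feed entry #{entry.id}") }

    # Only makes sense when the app is served by an EM-based server (e.g. Thin),
    # so that the reactor is already running during the request
    EM.defer(work, callback)

    head :ok # respond to Superfeedr immediately; the write happens afterwards
  end
end
```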

Eric G
  • Yes, I am trying to implement a PubSubHubbub client with that gem, and I do want these updates to be async. It appears that the XMPP client will not work in our scenario. I was told that this controller should accept the pushed content from Superfeedr and inspect it for relevant information; if relevant information is found, I was told to post or send that to another controller that is ready and willing to accept that information. I am all for doing this right, so if this sounds odd forgive me because I am naive. – LRH Oct 19 '11 at 23:50
  • Well, I don't quite understand why you would want to do that, since then you're adding another network hop, on top of the time to write the data, to the request. The PSHB docs say _[the subscriber] should not fully process a notification upon delivery, but store it somewhere for processing asynchronously at a later time..._ So to me that means you should find a way to offload the writing. [Ruby Rogues](http://rubyrogues.com/queues-and-background-processing/) had a good discussion of some options, in addition to the EM.defer option I mentioned; a rough queue-based sketch is below these comments. – Eric G Oct 20 '11 at 01:09
  • PS: You may also want to ask on the Superfeedr Google group; I'm sure it's a common scenario. – Eric G Oct 20 '11 at 01:13
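
A rough sketch of the queue-based offloading mentioned in the comment above, using Resque as just one example backend (StoreFeedEntry and FeedEntry are made-up names):

```ruby
# app/jobs/store_feed_entry.rb -- a Resque job that does the slow work later
class StoreFeedEntry
  @queue = :feeds

  def self.perform(raw_payload)
    # parse and persist entirely outside the request/response cycle
    FeedEntry.create!(:payload => raw_payload)
  end
end

# In the controller action, enqueue instead of writing inline:
#   Resque.enqueue(StoreFeedEntry, request.raw_post)
#   head :ok
```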