
I'm writing a Rails web service that interacts with various pieces of hardware scattered throughout the country.

When a call is made to the web service, the Rails app then attempts to contact the appropriate piece of hardware, get the needed information, and reply to the web client. The time between the client's call and the reply may be up to 10 seconds, depending upon lots of factors.
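For illustration, the flow is roughly this (the class and method names below are placeholders, not my real code):

    # Simplified sketch of the current flow -- ReadingsController,
    # HardwareClient and fetch_status are placeholder names.
    class ReadingsController < ApplicationController
      def show
        # Contact the remote device; this network round trip can take
        # up to ~10 seconds, and the whole Rails process blocks on it.
        reading = HardwareClient.new(params[:device_id]).fetch_status
        render :json => reading
      end
    end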

I do not want to split the web service call in two (ask for information, answer immediately with a pending reply, then force another api call to get the actual results).

I basically see two options: either run JRuby and use multithreading, or run several regular Ruby instances and hope that not many people try to use the service at once. JRuby seems like the much better solution, but it still doesn't seem to be mainstream or to have out-of-the-box support at Heroku and EngineYard. The multiple-instance approach seems like a total kludge.

1) Am I right about my two options? Is there a better one I'm missing? 2) Is there an easy deployment option for JRuby?

Kyle Heironimus

3 Answers


I do not want to split the web service call in two (ask for information, answer immediately with a pending reply, then force another api call to get the actual results).

From an engineering perspective, this seems like it would be the best alternative.

Why don't you want to do it?

yfeldblum
  • That was my first thought. Making it a background task and then having the client poll for updates is pretty much the de facto approach to this sort of problem. – Paul Leader Dec 30 '10 at 23:18

There's a third option: if you host your Rails app with Passenger and enable global queueing, you can do this transparently. I have some actions that take several minutes and run with no issues (caveat: some browsers may time out, but that may not be a concern for you).

If you're worried about browser timeouts, or you cannot control the deployment environment, you may want to process the request in the background (rough sketch after the list):

  1. User requests data
  2. You enter the request into a queue
  3. Your web service returns a "ticket" identifier the client can use to check progress
  4. A background process processes the jobs in the queue
  5. The user polls back, referencing the "ticket" id
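Roughly, that flow in Rails might look like the sketch below, using delayed_job for step 4 (HardwareRequest, HardwareClient, and ContactHardwareJob are just illustrative placeholders):

    # Sketch only -- HardwareRequest is a hypothetical ActiveRecord model
    # (columns: device_id, status, result) and HardwareClient stands in
    # for whatever actually talks to the remote device.
    ContactHardwareJob = Struct.new(:request_id) do
      # delayed_job runs any enqueued object that responds to #perform.
      def perform
        request = HardwareRequest.find(request_id)
        result  = HardwareClient.fetch(request.device_id)   # the slow part
        request.update_attributes(:status => 'done', :result => result)
      end
    end

    class HardwareRequestsController < ApplicationController
      # Steps 1-3: record the request, enqueue the job, return a ticket id.
      def create
        request = HardwareRequest.create!(:device_id => params[:device_id],
                                          :status    => 'pending')
        Delayed::Job.enqueue(ContactHardwareJob.new(request.id))  # step 4 runs later
        render :json => { :ticket => request.id }
      end

      # Step 5: the client polls with the ticket id until status is 'done'.
      def show
        request = HardwareRequest.find(params[:id])
        render :json => { :status => request.status, :result => request.result }
      end
    end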

As for hosting on JRuby, I've deployed a couple of small internal applications using the glassfish gem, but I'm not sure how much I would trust it for customer-facing apps. Just make sure you run config.threadsafe! in production.rb. I've heard good things about Trinidad, too.
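Concretely, that's a one-liner in config/environments/production.rb (Rails 3 style shown; "MyApp" is a placeholder for your application's module name, and on Rails 2.3 it's just a bare config.threadsafe! line in that file):

    MyApp::Application.configure do
      # Let Rails serve concurrent requests from multiple threads --
      # needed when running under a threaded JRuby server such as
      # GlassFish or Trinidad.
      config.threadsafe!
    end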

John Douthat
  • Is the ticket method you mention used much? It makes for a more awkward API from a client's perspective, but I can see its virtues. – Kyle Heironimus Dec 30 '10 at 21:19
  • 1
    indeed it is. (e.g. http://stackoverflow.com/questions/1835652/long-running-webservice-architecture http://stackoverflow.com/questions/3477561/common-ways-of-handling-long-running-process-in-asp-net http://stackoverflow.com/questions/3940231/how-to-build-a-time-consuming-web-service) – John Douthat Dec 30 '10 at 23:32

You can also run the web service call in a delayed background job so that it's not tying up a web server process, and the job can even run on a separate physical box. This is also a much more scalable approach. If you make the web call using AJAX, you can then ping the server every second or two to see if your results are ready; that way your client is not held in limbo while the results are being calculated, and the request does not time out.

Pan Thomakos