
I am creating a search page that displays up to 9 rates. On the frontend, I am sending a request to my rails application that contains the necessary data to grab the 9 rates.

In one of my Rails controllers, I crawl a webpage to get the rate. This can take anywhere between 2 and 15 seconds.

I would like to run all 9 requests in the background so the server can still process other requests that come in. For example, the user can make a search and suggested results will display.

I am attempting to use the concurrent-ruby gem with Promises. The cleaned_params variable is an array of the data needed to make each request. There are up to 9 sets of request data.

Here is what I have so far:

    tasks = cleaned_params.map do |request_data|
      # Pass request_data as an argument so the future's block doesn't
      # close over the loop variable; the block parameter is renamed to
      # avoid shadowing the outer one.
      Concurrent::Promises.future(request_data) { |data| api_get_rate(data) }
    end

    # My tasks could still be in the pending state; all_promises is a new
    # promise that will be fulfilled once all of the inner promises have
    # been fulfilled.
    all_promises = Concurrent::Promises.zip(*tasks)

    # Use all_promises.value! to block - I don't want to render a response
    # until we have the rates.
    render json: { success: true, status: 200, rates: all_promises.value! }

Right now I see that all the calls to api_get_rate are being started, but inside my api_get_rate function I call a method in another class, BetterRateOverride.check_rate. When I run this same code synchronously, that call succeeds, but with the setup above my code just hangs once it reaches it. Why is this happening?

Is it not possible to call a method from another class while in a background thread? Do promises run in background threads? I read that promises run on concurrent-ruby's global thread pool.

If this is not the best approach, can you steer me in the right direction?

Thanks for any help.

Edit: I think this may be the cause of my code deadlocking: https://github.com/rails/rails/issues/26847
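
Edit 2: If that interlock is the problem, the Rails threading guide suggests wrapping thread work in the executor and permitting concurrent loads around the blocking wait. A sketch of what that might look like here (untested):

    tasks = cleaned_params.map do |request_data|
      Concurrent::Promises.future(request_data) do |data|
        # Make Rails' autoloading/load interlock safe to use from a
        # thread-pool thread.
        Rails.application.executor.wrap { api_get_rate(data) }
      end
    end

    # While the request thread blocks on the futures, let the worker
    # threads acquire the autoload lock; otherwise they can deadlock
    # against this thread (see rails/rails#26847).
    rates = ActiveSupport::Dependencies.interlock.permit_concurrent_loads do
      Concurrent::Promises.zip(*tasks).value!
    end

    render json: { success: true, status: 200, rates: rates }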

windfallisland
  • TBH concurrency doesn't seem like the optimal approach here. There is a chance you "just" need to optimize your DB and code inside `api_get_rate`. Can you provide the code for `api_get_rate`? – xlembouras Jul 08 '19 at 05:55
  • @xlembouras The api_get_rate function makes a call that crawls the web for rate data - this crawl can take up to 15 seconds, sometimes more. It's not grabbing the rate data from the DB at the moment. Why do you say concurrency is not the optimal approach here? Do you have any other suggestions? – windfallisland Jul 08 '19 at 16:10
  • I agree with @fylooi 's [answer](https://stackoverflow.com/a/56930633/687142). You need a queueing system and/or some off-line operations. – xlembouras Jul 09 '19 at 05:12
  • What the comments are saying is correct. Rethink the problem. Controllers should never make long-running API calls, because they will tie up your server's connections while waiting for data. Instead, that work should be delegated to a worker, probably through ActiveJob. Recommend [this](https://blog.codeship.com/how-to-use-rails-active-job/) as a tutorial. It has a section titled "Talking with external APIs". – Glyoko Jul 09 '19 at 05:23
  • @Glyoko I see, thanks for the suggestions! Definitely going to do some digging. – windfallisland Jul 10 '19 at 16:44

2 Answers


The conventional Rails approach to this kind of problem is to implement the long-running request as a background job using ActiveJob.

Each rate request would trigger a separate job running in a worker process, and each job would write its status and result to the DB (or Redis) upon completion.

You'd then have another controller action which your JS polls to check the status / results of the individual jobs.
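
A minimal sketch of that setup — the RateRequest model, RateFetchJob, and RateFetcher names are hypothetical, and the crawl logic would need to move out of the controller so the job can call it:

    # app/jobs/rate_fetch_job.rb
    class RateFetchJob < ApplicationJob
      queue_as :default

      def perform(rate_request)
        # RateFetcher.call stands in for the slow api_get_rate crawl,
        # extracted from the controller into a service object.
        rate = RateFetcher.call(rate_request.query)
        rate_request.update!(status: "done", rate: rate)
      end
    end

    # In the search controller: enqueue one job per rate and respond
    # immediately with the ids the frontend should poll.
    def create
      requests = cleaned_params.map do |query|
        RateRequest.create!(query: query, status: "pending")
      end
      requests.each { |r| RateFetchJob.perform_later(r) }
      render json: { success: true, ids: requests.map(&:id) }
    end

    # Polling endpoint: the frontend calls this until every status is "done".
    def status
      requests = RateRequest.where(id: params[:ids])
      render json: requests.as_json(only: [:id, :status, :rate])
    end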

fylooi

Unless you're a Rails expert, I would recommend against using the concurrent-ruby gem together with Rails, as it can make things quite complicated.

One common approach was already provided by @fylooi - using ActiveJob to handle background jobs and a JavaScript poller to detect when they're finished. You would have to set up the ActiveJob backend, which is a bit of work.

Another solution would be to stay completely synchronous in Rails and do the parallelization in JavaScript instead, i.e., run multiple AJAX requests in parallel. (Browsers cap concurrent requests to the same host at around 6, but that might be enough for your case.)
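
On the Rails side, that just leaves a plain synchronous action per rate — roughly this sketch, where the action and the rate_params helper are assumed names:

    # Each AJAX call fetches exactly one rate; the browser supplies the
    # parallelism by firing several of these requests at once.
    def rate
      rate = api_get_rate(rate_params)  # the existing slow, synchronous crawl
      render json: { success: true, rate: rate }
    end

Note that the app server then needs enough threads or worker processes (e.g. Puma's pool) to actually serve those requests concurrently.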

claasz