
I have 2 Ruby on Rails 4 apps on the same server (they do not - and should not - share a database):

deploy@Ubuntu-1404-trusty-64-minimal:~/applications$ ls
app1  app2

How do I exchange data between app1 and app2?

My current implementation is unstable and not secure:

app1 requests app2 to update first and last name of the user with username bobby:

# app1
HTTParty.get("https://app2.com/update_full_name?username=bobby&first_name=Bob&last_name=Dylan")

app2 receives app1's request and processes it:

# app2 app/controllers/some_controller.rb
def update_full_name
  user = User.find_or_create_by(username: params[:username])
  user.update_attributes(first_name: params[:first_name], last_name: params[:last_name])
end

I have read that ActiveResource has been removed from Rails 4. I never really understood ActiveResource anyway, so I will not explore it further and will prefer a different solution.

Cjoerg
  • ActiveResource is the way to go for this, or you can use the 'her' gem. Also, ActiveResource was not removed from Rails 4; it was separated out into a different gem. I have used ActiveResource with Rails 4. Do you have different databases in different applications, or do you have something like a frontend and a backend application? – Chitrank Samaiya Jan 21 '15 at 05:21

5 Answers


I had the same problem and explored every option: ActiveResource (which is deprecated), callbacks with my own homegrown API wrappers, queueing with Redis or RabbitMQ. Nothing was easy enough for my simple mind to implement. If Model1 in App1 will always update Model2 in App2, then the best solution I've found with Rails is the Promiscuous gem.

It makes it pretty simple to run a pub/sub system that keeps data synced between two Ruby/Rails apps. It works with both ActiveRecord and Mongoid. The documentation goes into more depth, but here are a few gotchas I found when trying to set it up using the Quick Start guide on the GitHub page.

  1. Make sure you have an initializer file in both apps that connects to your shared RabbitMQ instance.
  2. If using ActiveRecord on the publisher side, you will need to create a new table (the subscriber does not need this table, to my knowledge):

    create_table :_promiscuous do |t|
      t.string    :batch
      t.timestamp :at, :default => :now
    end
    
  3. You will also need to add a column to every publisher and subscriber model:

    # in App1 - publisher
    add_column :publisher_model, :_v, :integer, limit: 8, default: 1
    
    # in App2 - subscriber
    add_column :subscriber_model, :_v, :integer, limit: 8
    
  4. You can set the name of the published model. For example, if I have a namespaced class Admin::User in App1, I can publish its attributes :as => 'AdminUser', and as long as App2 has a model AdminUser, it will listen correctly.

  5. If you've followed the instructions from the GitHub page, included your mixins, and set publishable/subscribable attributes, you will inevitably want to run it in production, in which case your subscriber will need to run a worker. I use a pretty shameless ripoff of this Resque deploy script; my version for Promiscuous can be found here, and it seems to work.

I'm finding more and more ways to use this setup. It gives me a lot more flexibility with sharing and managing my data. Good luck.

Sam M.

So if I were you, I would look into a queueing system.

The nice thing about it is that it's asynchronous, so you don't really have to worry if both apps are up and running as soon as an update occurs.

I'm not really an expert on message queues, but RabbitMQ with something like the Bunny gem could do a fine job if you ask me.

Anyone with experience on message queueing feel free to edit me.
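To make that concrete, here is a minimal sketch using the Bunny gem. The queue name, payload shape, and helper methods are my own assumptions, and the broker is assumed to be reachable on localhost:

```ruby
require "json"

# Build the message app1 would publish (payload shape is an assumption).
def build_user_update(username, first_name, last_name)
  JSON.generate(event: "user.update", username: username,
                first_name: first_name, last_name: last_name)
end

# app1 side: push the message onto a durable queue.
# Needs the bunny gem and a running RabbitMQ broker.
def publish_user_update(payload)
  require "bunny"
  conn = Bunny.new # assumes RabbitMQ on localhost with default credentials
  conn.start
  ch = conn.create_channel
  q = ch.queue("user_updates", durable: true)
  ch.default_exchange.publish(payload, routing_key: q.name, persistent: true)
  conn.close
end

# app2 side: parse a delivered message; the commented lines show where
# the controller logic from the question would go.
def handle_user_update(body)
  data = JSON.parse(body)
  return unless data["event"] == "user.update"
  # user = User.find_or_create_by(username: data["username"])
  # user.update_attributes(first_name: data["first_name"],
  #                        last_name: data["last_name"])
  data
end
```

app2 would then run a small worker that calls q.subscribe(block: true) { |_info, _props, body| handle_user_update(body) }; because the queue is durable, app1 can keep publishing while app2 is down.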

jfornoff

My suggestion is to keep an API, and if you want a quick solution for that you may take a look at the rails-api gem and its documentation.

For the data exchange you can use a persistent HTTP library like Typhoeus and queue the requests, as explained in its documentation in the section Making Parallel Requests.

Concerning security, a simple "token" would be enough to make sure you don't get forged requests. You could also keep a table of the hosts allowed to exchange data and their tokens.
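As a sketch of that token check (the header name, env var, and helper are hypothetical; the comparison is done in constant time so response timing doesn't leak the token):

```ruby
# Constant-time comparison so an attacker can't guess the token
# byte-by-byte from response timing (hand-rolled here to stay
# dependency-free; Rails ships ActiveSupport::SecurityUtils for this).
def secure_token_match?(provided, expected)
  return false unless provided && provided.bytesize == expected.bytesize
  diff = 0
  provided.bytes.zip(expected.bytes) { |a, b| diff |= a ^ b }
  diff.zero?
end

# In app2 this would guard every API action, e.g. in a base controller:
#   before_action { head :unauthorized unless secure_token_match?(
#     request.headers["X-Api-Token"], ENV["APP1_API_TOKEN"]) }
```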

Another option is to use the Publisher/Subscriber pattern; for more in-depth knowledge I recommend this article and this blog post.

Paulo Fidalgo
  • Hi. Is that really the best solution? It seems unnecessary to go through regular web APIs instead of building internal communication between the apps. – Cjoerg Jan 16 '15 at 18:54
  • @ChristofferJoergensen The best solution will be your decision, apart from that you could also take a look at a PUB SUB solution. – Paulo Fidalgo Jan 16 '15 at 19:01
  • Could you maybe refer to an article that described how this would work between applications? A quick Google search points to redis pub/sub where the applications can post updates through a redis db. Or do I misunderstand? – Cjoerg Jan 16 '15 at 19:16
  • This is the way to do it. Any solution you come up with will need one app "publishing" changes and the other "subscribing" to them; you may as well do it with the HTTP you already have, without adding any extra infrastructure. – Xavier Shay Jan 19 '15 at 19:07

It seems that you are required to share something between these two applications, in your example, it is a URL. I understand the need for separate databases, but I'm going to assume you're okay with having some sort of 3rd shared resource.

I like Paulo's idea of using Redis here, but I think going into pub/sub and Typhoeus could be more complexity than is necessary. My suggestion is:

  1. Store update-able info in Redis
  2. Use cron to run a rake task to pull updates

#1

Assuming you have set up Redis and redis-rb:

When you update a model on app1, store the applicable changes in Redis:

### User just updated! ###
# Grab the attributes you want to update on the other server
attributes_i_care_about = user.attributes.extract!(*%w( username first_name last_name ))

# Set a key for this user; future updates will overwrite it, leaving only the most recent
key_for_this_user = "user_updates:#{user.username}"

# Store it (mapped_hmset takes the hash directly; @redis = Redis.new pointed at the shared instance)
@redis.mapped_hmset(key_for_this_user, attributes_i_care_about)

#2

Set up cron to run as often as you like. I won't go into cron details here, but the command should be pretty simple once you set up a Rake task. Something like: bundle exec rake user_updates:process

Where the rake task might look something like this:

namespace :user_updates do

  desc "Process user updates from the other server"
  task process: :environment do
    redis = Redis.new # point this at the shared instance

    # KEYS is fine at small scale; prefer SCAN if the keyspace grows
    redis.keys("user_updates:*").each do |key|
      updated_attributes = redis.hgetall(key)
      user = User.find_or_create_by(username: updated_attributes["username"])
      user.update_attributes(updated_attributes)
      redis.del(key) # get rid of the key after use
    end

  end

end
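If you go the cron route, the crontab entry on app2's server might look like this (the application path and log destination are assumptions based on the directory layout in the question):

```shell
# m h dom mon dow  command — run the processing task every minute (cron's smallest interval)
* * * * * cd /home/deploy/applications/app2 && bundle exec rake user_updates:process RAILS_ENV=production >> log/cron.log 2>&1
```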

No internet connection required!

The Worker Ant
  • Very interesting approach. Relying on a shared redis db sounds very smart! I will try this out. One question: You suggest to have a recurring cron job running to start the rake task. I need my communication to be as fast as possible, so I would need the cron job to run every second. I can read from a few sources [e.g. this one](http://askubuntu.com/questions/334662/cron-job-every-second) that a cron job runs at least a minute. Can you think of an alternative solution to the cron job? – Cjoerg Jan 19 '15 at 07:19
  • I was going for a basic approach that didn't increase requirements too much, but you could use this: https://github.com/homer6/frequent-cron. Or since you already have Redis in mind, Resque works well for this sort of thing and you could just move your rake task onto a `User.perform` method and then run a Resque worker. Granted this will take a little more setup work. https://github.com/resque/resque – The Worker Ant Jan 19 '15 at 12:08
  • 1) You don't need an internet connection for an API-based approach either; you can just connect on localhost. 2) If the apps are not allowed to share a database, why are they allowed to share a Redis, which is just another database? Adding another piece of infrastructure here seems like needless complexity. – Xavier Shay Jan 19 '15 at 19:02
  • 3) Using cron for regular tasks is a bad idea, particularly when they happen frequently. Without careful coding, you risk a stampeding herd where a new process is scheduled before the last one has finished. A `while true` loop with a small sleep in it would be better. (Or better: your process manager should gracefully restart processes that exit 0, so you don't even need the loop yourself.) – Xavier Shay Jan 19 '15 at 19:05
  • @XavierShay, thanks for your comment. First, regarding the shared resource: I believe there is a big difference between using a shared Redis to store commands between applications and sharing a database with e.g. user info. So I will try out this approach. I do believe, though, that it remains an open question how app1 "starts" app2's processing tasks. Do you think I should start a separate question for this subject, or is it necessary to get it answered/settled within this open question? – Cjoerg Jan 19 '15 at 19:30
  • 1) Why not just use a different database on the server you already have, rather than introducing a new type of thing? 2) app2 just starts them on boot. If app1 isn't ready, it just keeps retrying. – Xavier Shay Jan 19 '15 at 19:31
  • Hi @XavierShay, I'm not quite following. You want to replace the shared redis db with a shared regular db (like mysql/mongodb), and then store commands inside this new shared db? Then how would app2 be notified by app1 that a new command has been put into the shared db? – Cjoerg Jan 19 '15 at 19:42
  • just poll it. Some databases have fancier ways to do this, but unless you're dealing with a serious amount of load you don't need to worry about it. Alternatively, you can POST to a web endpoint from app1 to app2 (though if you're doing this you can just include the changes in that POST!) – Xavier Shay Jan 19 '15 at 19:46
  • Right, doing POST notifications is not a viable solution, but to poll changes might be an interesting approach. Do you have a link to a gem or an article that can help me read more into this? I found the gem [mongo-watch](https://github.com/TorchlightSoftware/mongo-watch), but it's deprecated. – Cjoerg Jan 19 '15 at 19:57

My primary choice would be to implement an API using ActiveResource. If you don't want the API solution, you can use rake tasks with cron to exchange data between the two Rails applications, like @TheWorkerAnt suggested, but without Redis.

The difference is:

  • Add a column to your records to indicate which of them are synchronized.
  • Make connections to your two databases in your rake tasks. Reference.

The advantage of this approach is that it doesn't rely on external systems.
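A sketch of what that rake task's core could look like. The RemoteUser model, the synchronized column, and the app1_database entry in database.yml are all assumptions; the sync loop itself is written against duck-typed records so it reads (and tests) without a full Rails setup:

```ruby
# In app2, a model pointed at app1's database would look roughly like
# (needs an `app1_database` entry in config/database.yml):
#
#   class RemoteUser < ActiveRecord::Base
#     establish_connection :app1_database
#     self.table_name = "users"
#   end

# The sync loop: copy each unsynchronized row into the local users
# collection, then mark the source row as done. Anything responding to
# find_or_create_by / update_attributes works, not just ActiveRecord.
def sync_users(unsynced_rows, local_users)
  unsynced_rows.each do |row|
    user = local_users.find_or_create_by(username: row.username)
    user.update_attributes(first_name: row.first_name, last_name: row.last_name)
    row.update_attributes(synchronized: true) # don't re-process next run
  end
end

# Inside the rake task this would be called as:
#   sync_users(RemoteUser.where(synchronized: false), User)
```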

Rodrigo