10

We need a Rails caching solution that works with a multi-dyno formation on Heroku. Specifically, we need worker dynos processing long-running tasks to write to a cache that our web dynos can read from.
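
To make it concrete, the usage pattern we're after looks roughly like this (an illustrative sketch only; the job and controller names are made up, and Rails.cache would point at whatever shared store ends up backing it):

  # Worker dyno: a long-running task writes its result into the shared cache.
  class ReportBuilderJob
    def perform(report_id)
      result = build_expensive_report(report_id) # hypothetical long-running work
      Rails.cache.write("report/#{report_id}", result, expires_in: 1.hour)
    end
  end

  # Web dyno: a controller reads the result back from the same cache.
  class ReportsController < ApplicationController
    def show
      @report = Rails.cache.read("report/#{params[:id]}")
      render :pending unless @report
    end
  end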

Apparently the only way to implement a shared cache across dynos is to use Memcached. However, I'm having trouble finding objective instructions on how to get this done. (The Heroku docs are written by Memcached add-on vendors like MemCachier, who are promoting their own products.)

My questions:

  • Is a 3rd party add-on necessary, or can Memcached be implemented directly on a dyno within the formation?
  • Is using Memcached via an outside service even practical? If the whole point of Memcached is high-performance in-memory access, doesn't the network latency of an external service negate that?
  • If using add-ons, is there a reason to choose MemCachier vs Memcached Cloud?
Yarin

2 Answers

6

Disclosure - I work at Redis Labs, the company that provides the Memcached Cloud add-on.

  • I'm not familiar with anyone running a datastore/database directly off a dyno - Heroku's own and 3rd-party add-ons are available exactly for that.
  • Yes, using a remote Memcached is the common way to go for a web app that needs to scale to multiple dynos. Despite not being colocated on the same server, you'll still get responses from your Memcached in <1 msec.
  • Choose the add-on that gives you the most value for your money - not only in terms of RAM per $, but also with regard to robustness and functionality - refer to this comparison for more information.
Itamar Haber
  • Thanks for this - we just installed the Memcached Cloud add-on; implementation turned out to be no problem, we just tweaked the Rails 3 instructions slightly ([see my answer](http://stackoverflow.com/a/21665167/165673)). Looking good! – Yarin Feb 09 '14 at 21:20
  • How can I use Redis Cloud for both without running out of memory? I want my Sidekiq jobs to persist, yet I want my fragments to auto-expire after a memory threshold has been reached. – Mohamad Sep 22 '15 at 14:17
  • I'm not sure I'm following you - please open a new question. – Itamar Haber Sep 22 '15 at 14:58
  • @ItamarHaber I believe I have asked a similar question to what Mohamad is asking here: http://stackoverflow.com/q/32833021/1299792 – Marklar Sep 28 '15 at 22:54
5

(@ItamarHaber answered the question and sold me on Memcached Cloud. Just wanted to show exactly how we implemented it.)

Using the Memcached Cloud add-on with Rails 4 (derived from the Rails 3 instructions):

Add Dalli to your Gemfile:

gem 'dalli'

Set the cache_store in config/environments/production.rb:

  # NOTE: ENV vars aren't available during slug compilation, so we must check that they exist:
  if ENV["MEMCACHEDCLOUD_SERVERS"]
    config.cache_store = :mem_cache_store, ENV["MEMCACHEDCLOUD_SERVERS"].split(','), { :username => ENV["MEMCACHEDCLOUD_USERNAME"], :password => ENV["MEMCACHEDCLOUD_PASSWORD"] }
  end
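
With this in place, every dyno in the formation talks to the same Memcached Cloud instance through the standard Rails.cache API, so a value written on a worker dyno is readable from a web dyno. A quick sanity check (sketch only; the key name is arbitrary):

  # On one dyno (e.g. a console opened with `heroku run rails console`):
  Rails.cache.write("shared-key", "written by a worker", expires_in: 10.minutes)

  # On any other dyno, the same key resolves because both point at Memcached Cloud:
  Rails.cache.read("shared-key") # => "written by a worker"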

UPDATE:

After a little more research, we realized that Redis could provide us with all the benefits of Memcached caching plus a slew of other features. Redis Labs, the makers of the Memcached Cloud add-on, also offer the Redis Cloud add-on, which is just as easy to use:

Using the Redis Cloud add-on with Rails 4:

Add Redis to your Gemfile:

gem 'redis-rails'

Set the cache_store in config/environments/production.rb:

  # NOTE: ENV vars aren't available during slug compilation, so we must check that they exist:
  if ENV["REDISCLOUD_URL"]
    config.cache_store = :redis_store, ENV["REDISCLOUD_URL"], { expires_in: 90.minutes }
  end
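
Cache usage is identical to the Memcached setup above; the expires_in: 90.minutes here is just a default TTL that individual calls can override. For example (a sketch; the key and method names are made up):

  # Low-level/fragment caching goes through the same Rails.cache interface:
  Rails.cache.fetch("dashboard/stats", expires_in: 5.minutes) do
    compute_dashboard_stats # hypothetical; only runs on a cache miss
  end
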
Yarin
  • Yep - we've actually gotten that input recently from another user (re. the conditional initializer) and we've been debating internally what's the best way to work around this - your solution is basically what we've arrived at, so we'll be updating the relevant docs shortly. BTW, another approach one could consider is using Heroku's https://devcenter.heroku.com/articles/labs-user-env-compile but that's labs-only and against their own best practices :) – Itamar Haber Feb 09 '14 at 21:56
  • How does speed of `:redis_store` compare with `:mem_cache_store`? – Dogweather Jul 01 '15 at 18:26
  • @Yarin what if you are using Sidekiq? Then you will not want to expire the cache... how does one use Redis for both (persistent data like Sidekiq jobs, and transient data like fragments) without eventually running out of memory? – Mohamad Sep 22 '15 at 14:14