
I have a blog page I developed using Rails 5.1. Everything works fine, except that after I create a post in production and attach an image, the image stops showing after a while (say 30 minutes). I searched the internet for solutions and found this, which suggests the problem has to do with Heroku wiping the filesystem on every app restart. One solution offered is to host your images on a service like Amazon S3.

I have, however, set up S3, and the images are being uploaded to the bucket (bucket screenshot omitted).

And yet the blog post images still disappear. I need help, as I cannot figure out what I am missing. Here is the relevant code:

shrine.rb:

require "shrine"
require "shrine/storage/s3"
s3_options = {
    access_key_id:      ENV['S3_KEY'],
    secret_access_key:  ENV['S3_SECRET'],
    region:             ENV['S3_REGION'],
    bucket:             ENV['S3_BUCKET'],
}

if Rails.env.development?
  require "shrine/storage/file_system"
  Shrine.storages = {
    cache: Shrine::Storage::FileSystem.new("public", prefix: "uploads/cache"), # temporary
    store: Shrine::Storage::FileSystem.new("public", prefix: "uploads/store")  # permanent
  }
elsif Rails.env.test?
  require 'shrine/storage/memory'
  Shrine.storages = {
    cache: Shrine::Storage::Memory.new,
    store: Shrine::Storage::Memory.new
  }
else
  require "shrine/storage/s3"

  Shrine.storages = {
    cache: Shrine::Storage::S3.new(prefix: "cache", **s3_options),
    store: Shrine::Storage::S3.new(prefix: "store", **s3_options)
  }
end
Shrine.plugin :activerecord # or :sequel
Shrine.plugin :cached_attachment_data # for retaining the cached file across form redisplays
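For context, this initializer only configures where files are stored; in Shrine 2.x the attachment itself is wired into a model through an uploader class. The `ImageUploader` and `Post` names below are assumptions for illustration, not taken from the question:

```ruby
# app/uploaders/image_uploader.rb
class ImageUploader < Shrine
  # validation and processing plugins would go here
end

# app/models/post.rb -- assumes the posts table has an `image_data` text column
class Post < ApplicationRecord
  include ImageUploader::Attachment.new(:image) # adds #image, #image=, #image_url
end
```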

Gemfile:

# ...
# A rich text editor for everyday writing
gem 'trix', '~> 0.11.1'
# a toolkit for file attachments in Ruby applications
gem 'shrine', '~> 2.11'
# Tag a single model on several contexts, such as skills, interests, and awards
gem 'acts-as-taggable-on', '~> 6.0'
# frameworks for multiple-provider authentication.
gem 'omniauth-facebook'
gem 'omniauth-github'
# Simple Rails app key configuration
gem "figaro"
# ...

I use the Figaro gem to manage the environment variables. They are set correctly, since the S3 bucket responds, and I already have OmniAuth up and running on the blog.

Here is the error the Chrome console shows for the image (console screenshot omitted).

I really need help to get this blog up and running. Thank you for your time.

Chidozie Nnachor
  • Images get deleted from AWS S3 as well? If you've set up automatic expiry of `cache/*` images, I would recommend reviewing it again to check whether you've targeted only the `cache/*` directory. You can also enable logging for the S3 bucket and see when and what is making DELETE requests to it. This is very strange; Shrine doesn't just automatically delete files from storage, only when you delete records, and that happens immediately. – Janko Jul 03 '18 at 01:41
  • @janko-m, the images don't get deleted on AWS at all. They just stop loading on the blog page; it shows a small broken-image icon with an "x" mark, just like you would see with a corrupted file. Suffice it to add that I use the Trix editor for drag-and-drop while composing a post. Everything works fine on the local machine; it breaks only in production. – Chidozie Nnachor Jul 03 '18 at 10:12
  • Maybe it has something to do with the fact that the S3 URLs Shrine generates are expiring by default? Are they being cached somehow in your application? – Janko Jul 03 '18 at 14:02
  • @janko-m, that could be a possible reason. If that is the case, do you know how I could stop that from happening? I am not really an expert in this stuff, but I am willing to learn. I only started coding this January, so don't be put off by my questions ;) – Chidozie Nnachor Jul 04 '18 at 11:24

1 Answer


Shrine generates expiring S3 URLs by default, so it's possible that the generated URLs are somehow getting cached, and then the images become unavailable once the URL has expired.
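For background on why caching an expiring URL breaks, a presigned S3 URL carries its signing time and validity window in its query string (`X-Amz-Date` and `X-Amz-Expires`, the latter being the lifetime in seconds). Once that window passes, S3 rejects requests for the URL even though the object still exists in the bucket. Here is a minimal, self-contained Ruby sketch, using a made-up URL and a hypothetical helper, that checks whether such a URL has lapsed:

```ruby
require "uri"
require "cgi"
require "time"

# Returns true if a presigned S3 URL's signature window has elapsed.
# X-Amz-Date uses ISO 8601 basic format (e.g. 20180703T014100Z);
# X-Amz-Expires is the validity period in seconds.
def presigned_url_expired?(url, now: Time.now.utc)
  params    = CGI.parse(URI(url).query.to_s)
  signed_at = Time.strptime(params["X-Amz-Date"].first, "%Y%m%dT%H%M%S%z").utc
  lifetime  = Integer(params["X-Amz-Expires"].first)
  now > signed_at + lifetime
end

# Made-up example URL (not from this bucket), signed for one hour:
url = "https://my-bucket.s3.amazonaws.com/store/abc.jpg" \
      "?X-Amz-Date=20180703T014100Z&X-Amz-Expires=3600&X-Amz-Signature=deadbeef"

puts presigned_url_expired?(url)  # => true (the example was signed in 2018, long past its window)
```

A page that caches such a URL, whether in a CDN, a fragment cache, or the HTML itself, will keep serving it after the window closes, which matches the "works, then breaks after ~30 minutes" symptom.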

As a workaround, you can make S3 uploads public and generate public URLs instead. To do that, tell the S3 storage to make uploads public (note that this affects only new uploads; existing uploads remain private, so you'd have to make those public another way) and to generate public URLs by default, by updating the initializer:

# ...

require "shrine/storage/s3"

Shrine.storages = {
  cache: Shrine::Storage::S3.new(prefix: "cache", upload_options: { acl: "public-read" }, **s3_options),
  store: Shrine::Storage::S3.new(prefix: "store", upload_options: { acl: "public-read" }, **s3_options)
}

# ...

Shrine.plugin :default_url_options, cache: { public: true }, store: { public: true }

# ...
Janko
  • I keep getting this error, config/initializers/shrine.rb:12:in `': uninitialized constant Shrine::Storage::S3 (NameError) from /home/odogwudozilla/.rvm/gems/ruby-2.4.1/gems/railties-5.1.6/lib/rails/engine.rb:655:in `block in load_config_initializer' from /home/odogwudozilla/.rvm/gems/ruby-2.4.1/gems/activesupport-5.1.6/lib/active_support/notifications.rb:168:in `instrument' from /home/odogwudozilla/.rvm/gems/ruby-2.4.1/gems/railties-5.1.6/lib/rails/engine.rb:654:in `load_config_initializer'..... – Chidozie Nnachor Jul 06 '18 at 09:32
  • Apologies for the last comment. I had inadvertently commented something out. Stupid me. – Chidozie Nnachor Jul 06 '18 at 09:39
  • Thanks @janko-m, your solution worked for me like a charm. Just this evening I realised you are the author of the Shrine gem; it occurred to me once I kept bumping into the same "janko" in several of my searches involving Shrine all over the internet. Your enthusiastic engagement is very commendable. – Chidozie Nnachor Jul 06 '18 at 20:35
  • When using that option I get an AccessDenied error. What should I change in AWS so that it's allowed to upload public-read objects? – Andres Espinosa Mar 29 '21 at 16:32