
In a controller in my Rails app I have a method that scans my S3 bucket and selects images. It's causing page loads to be slow, but I like being able to loop through the bucket without hard-coding all the URLs.

Here is what I have:

@bucket = S3_BUCKET
@images = []

# Pages through every object in the bucket and keeps the ones
# whose key contains "inspiration".
@bucket.objects.each do |file|
  if file.key.include?("inspiration")
    @images << { url: file.public_url, key: file.key, type: 'file' }
  end
end

Is there another way to accomplish this so page load speeds don't suffer?

Jeremy Thomas
  • Any remote API call you make is going to incur some speed loss for the end user due to network latency and remote host processing speed. How much speed loss are we talking here? I'd be surprised if it was anything more than 500ms, and if it was, I'd start looking at the rest of the code for other potential issues. You can mitigate that speed loss by pre-fetching the images from the S3 bucket and storing them in a cache. You can trigger that in a cron job that runs every n minutes, but you can't get away from the fact that talking to another service takes time. – Billy Kimble Aug 30 '18 at 14:11
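The pre-fetch-and-cache idea from the comment above can be sketched as a small TTL cache. This is a minimal, illustrative sketch: the `CachedListing` class and the 300-second TTL are assumptions, and the expensive S3 scan is passed in as a block (in the app it would wrap the `@bucket.objects` loop; a Rails app might use `Rails.cache.fetch` instead).

```ruby
# Caches the result of an expensive listing call for a fixed TTL,
# so repeated page loads reuse the last fetched list instead of
# hitting S3 every time. Names here are illustrative.
class CachedListing
  def initialize(ttl_seconds, &fetcher)
    @ttl = ttl_seconds
    @fetcher = fetcher      # the expensive call, e.g. the S3 bucket scan
    @value = nil
    @fetched_at = nil
  end

  def images
    if @value.nil? || (Time.now - @fetched_at) > @ttl
      @value = @fetcher.call
      @fetched_at = Time.now
    end
    @value
  end
end

# Usage (the block runs only when the cache is empty or stale):
listing = CachedListing.new(300) { [:img_a, :img_b] }
listing.images
listing.images  # served from the cache, no second fetch
```

A cron or background job could call `images` every few minutes to keep the cache warm, as the comment suggests.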

2 Answers


As it turns out, there were many more files than expected and the loop took a long time to complete. I changed the code to:

@images = @bucket.objects(prefix: 'inspiration')

and the response was much faster.
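One caveat worth noting: `prefix` filtering happens server-side, so S3 only returns matching keys (which is why it's faster), but it only matches keys that *start with* the prefix, whereas the original `include?` check matched anywhere in the key. The key names below are illustrative:

```ruby
# Local illustration of the difference between substring matching
# (the original loop) and prefix matching (the faster query).
keys = ['inspiration/a.jpg', 'photos/inspiration/b.jpg', 'logo.png']

by_include = keys.select { |k| k.include?('inspiration') }
by_prefix  = keys.select { |k| k.start_with?('inspiration') }

by_include.length  # => 2  ('photos/inspiration/b.jpg' also matches)
by_prefix.length   # => 1  (only keys beginning with the prefix)
```

So the faster query is equivalent to the original loop only if all the relevant keys actually begin with `inspiration`.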

Jeremy Thomas

Since you really can't regulate the speed at which you access your S3 bucket, I would suggest setting up a CDN (content delivery network) with Amazon CloudFront. Please take a look at this article written by Brandon Hilkert about implementing a CDN:

 https://brandonhilkert.com/blog/setting-up-a-cloudfront-cdn-for-rails/
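Once a CloudFront distribution is pointed at the bucket (as the article above walks through), the same object keys can be served from the distribution's domain instead of `public_url`. A minimal sketch, where the CloudFront hostname is entirely hypothetical:

```ruby
# Hypothetical CloudFront distribution domain; a real app would get
# this from configuration after creating the distribution.
CDN_HOST = 'https://d1234abcd.cloudfront.net'

# Builds a CDN-backed URL for an S3 object key, so images are served
# from CloudFront's edge caches rather than directly from the bucket.
def cdn_url(key)
  "#{CDN_HOST}/#{key}"
end

cdn_url('inspiration/a.jpg')
# => "https://d1234abcd.cloudfront.net/inspiration/a.jpg"
```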

Side note - If you would like a free CDN option I would use

 https://cloudinary.com/pricing

Referencing when to use a CDN over s3

 https://stackoverflow.com/questions/3327425/when-to-use-amazon-cloudfront-or-s3
Snowman08