
I have an S3 bucket which contains index.html, index.js, and index.css: a static website.

I am using CloudFront to distribute this S3 bucket.

I am using a combination of CodePipeline and CodeBuild to update the files in my S3 bucket.

With this setup, it is possible for people to see old versions of my website. One option is to set up a Lambda function that creates a CloudFront invalidation whenever CodePipeline and CodeBuild update the files in the S3 bucket.

The problem with this is that invalidations are expensive, so I am looking for an alternative.

One solution that I've thought of is to introduce a directory structure in my S3 bucket like so:

v1/
├── index.html
├── index.js
└── index.css
v2/
├── index.html
├── index.js
└── index.css
...

With this setup, is it possible to make CloudFront point to the latest version of my website rather than invalidating the cache? This would be cheaper than cache invalidation.

Ogen
  • https://stackoverflow.com/a/10622078/2231632 - you might want to set the TTL to 0 and use the proper HTTP header semantics to see if that can help solve your problem. – Praba Jun 26 '18 at 08:10

1 Answer


This is how I solved it.

  1. Cache index.html for only 5 minutes.
  2. Cache the rest of the files for a year or more.
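The two rules above amount to choosing a Cache-Control header per file. A minimal sketch in Python (the helper name and exact max-age values are my own; adjust to taste):

```python
def cache_control_for(filename: str) -> str:
    """Pick a Cache-Control header: index.html is cached briefly,
    everything else (hashed assets) is cached long-term."""
    if filename == "index.html":
        # 5 minutes, so CloudFront re-fetches the entry point quickly
        return "public, max-age=300"
    # Hashed filenames never change content, so cache for a year
    return "public, max-age=31536000, immutable"

print(cache_control_for("index.html"))       # public, max-age=300
print(cache_control_for("index.ab12cd34.js"))
```

You would apply these headers as object metadata when uploading to S3, so CloudFront honors them on every response.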

Build Process:

  1. Create a new hash for each build.
  2. Follow the filename convention below.
  3. Copy index.html and all hashed build assets to the same folder; no directory versioning is needed.

This way, index.html is refreshed from CloudFront every 5 minutes, and since every other asset gets a new hashed name when it changes, the old cached copies are simply never requested again.

index.hash.js
index.hash.css

Replace hash with the hash generated for that build.
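One way to generate that convention is to derive the hash from each file's content, so the name only changes when the file does. A small sketch (the function name and 8-character SHA-256 prefix are my own choices; a per-build hash works equally well):

```python
import hashlib

def hashed_filename(name: str, content: bytes) -> str:
    """Turn index.js into index.<hash>.js, where <hash> is the
    first 8 hex characters of the file content's SHA-256."""
    digest = hashlib.sha256(content).hexdigest()[:8]
    stem, _, ext = name.rpartition(".")
    return f"{stem}.{digest}.{ext}"

# Same content -> same name; changed content -> new name,
# which busts the CloudFront cache without any invalidation.
print(hashed_filename("index.js", b"console.log('v1');"))
```

index.html then references these hashed names, so a fresh index.html automatically pulls in the fresh assets.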

Hope it helps.

Kannaiyan