I have a React app (SPA) that is deployed on S3 and publicly accessible. After each build, I have to manually upload index.html and the other static assets to S3. Is there any way to automate this process?

I searched extensively for CD (continuous deployment) to S3; here is a SO question on the same topic.

Aggregating everything from that research, I have written the answer below, which covers various methods to achieve this.

2 Answers

There are many ways to automate deployment to S3. Here is what I gathered:

1. AWS SNS and Lambda:

This approach is useful if you want to trigger AWS services on a GitHub push. Here is the process:

  1. The GitHub push publishes a message to an SNS topic.
  2. A Lambda function subscribed to that SNS topic is invoked.
  3. Inside the Lambda, clone the GitHub repository.
  4. Use AWS's S3 SDK to upload the build or dist directory to your S3 bucket.

Here is a high-level architecture of the above process:

(Diagram: automation using SNS & Lambda)

The downside of this approach is that cloning large repos takes time, and Lambda is billed for execution time, so this may become expensive for large repos.
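For illustration only, here is a minimal sketch of such a Lambda handler in Node.js. It assumes a git binary is available to the function (for example via a Lambda layer), that the built assets are already committed under build/, and that REPO_URL and BUCKET are hypothetical environment variables you would configure yourself:

    // Sketch of the Lambda handler subscribed to the SNS topic.
    // Assumes a `git` binary is available (e.g. via a Lambda layer) and
    // that REPO_URL and BUCKET are environment variables on the function.
    const { execSync } = require('child_process');
    const fs = require('fs');
    const path = require('path');
    const AWS = require('aws-sdk'); // bundled with the Node.js Lambda runtime

    const s3 = new AWS.S3();

    exports.handler = async () => {
      const repoUrl = process.env.REPO_URL; // e.g. https://github.com/user/repo.git
      const bucket = process.env.BUCKET;    // target S3 bucket name

      // /tmp is the only writable location inside Lambda.
      const workDir = '/tmp/repo';
      execSync(`rm -rf ${workDir}`);
      execSync(`git clone --depth 1 ${repoUrl} ${workDir}`);

      // Assumes the built assets are committed under build/; otherwise the
      // build would have to run here (npm install && npm run build).
      const buildDir = path.join(workDir, 'build');

      // Flat listing for brevity; a real implementation should recurse into
      // subdirectories and set ContentType on each object.
      const files = fs.readdirSync(buildDir);
      await Promise.all(files.map((file) =>
        s3.putObject({
          Bucket: bucket,
          Key: file,
          Body: fs.readFileSync(path.join(buildDir, file)),
        }).promise()
      ));

      return { uploaded: files.length };
    };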

2. Travis:

Travis is well known as a CI (continuous integration) service. A .travis.yml file in the repo drives the integration process.

If you want to run tests after the build and upload the files to S3 only on success, this approach is the best fit. Travis is free for open-source projects.

The downside is that I could not find a way to isolate a directory from the repo and upload that specific directory alone. One workaround is sketched below.
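One hedged workaround is to call the AWS CLI yourself after a successful build, which overlaps with approach 3 below. This is a minimal .travis.yml sketch, assuming a Node project and AWS credentials supplied as encrypted AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables; adapt it to your own setup:

    language: node_js
    node_js:
      - "8"
    script:
      - npm test
      - npm run build
    after_success:
      # Install the AWS CLI; credentials are read from the encrypted
      # AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables.
      - pip install --user awscli
      # Sync only the build output to the bucket (placeholder name).
      - aws s3 sync build/ s3://<bucket-name>

Because after_success only runs when the script phase passes, the upload happens only when the tests and build succeed.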

3. AWS CLI:

This is the cheapest and, in my view, the best way to upload the files to S3; it is the approach I used. I got this information from this Medium post.

In React apps, builds are usually triggered through npm or yarn via scripts in package.json. Here is the command for uploading the files to S3:

    aws s3 sync build/ s3://<bucket-name>

I added this command to the scripts in package.json. It was very handy and removed the manual step of uploading the files to S3.
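For example, a minimal scripts section might look like this (the react-scripts build command is an assumption for a create-react-app project, and the bucket placeholder is kept from the command above; substitute your own build command and bucket):

    {
      "scripts": {
        "build": "react-scripts build",
        "deploy": "npm run build && aws s3 sync build/ s3://<bucket-name>"
      }
    }

Running npm run deploy then builds and syncs in one step. Note that aws s3 sync only copies new or changed files; adding the --delete flag also removes objects that no longer exist locally.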

This answer is based on my own experience. If anything is incorrect or I have missed something, please feel free to comment and I will add it to the answer.

    Agree with AWS CLI approach. Lambda times out after a maximum of 300 seconds ... so, it is also not optimal for sites involving larger assets. – John Oct 02 '17 at 22:21

I love @lakshman's answer, but it won't work for private/on-prem Bitbucket repos and some other scenarios we run into in the commercial world.

A similar idea that is Bitbucket-friendly (and more) is to have the code cloned into CodeCommit and then trigger a CodePipeline that includes a CodeBuild step.

CodeBuild can push to S3. As a bonus, CodeBuild can run tests and include additional build steps as needed (migrations, etc.). Bitbucket has a mirror hook that allows CodeCommit to clone Bitbucket repos. I think this is the mirror hook link, but double-check yourself, as there are multiple Bitbucket plugins/extensions with the string "mirror" in the name.
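To make the CodeBuild step concrete, here is a minimal buildspec.yml sketch, assuming a Node.js build image and the same <bucket-name> placeholder used in the other answer; treat it as a starting point, not a drop-in file:

    version: 0.2

    phases:
      install:
        commands:
          - npm install
      build:
        commands:
          - npm test
          - npm run build
      post_build:
        commands:
          # Push the build output to the website bucket (placeholder name).
          - aws s3 sync build/ s3://<bucket-name>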

This SO question discusses a different Bitbucket hook and also covers cloning from GitLab and JGit. Again, once the code is in CodeCommit, CodePipeline can take it from there.

Instead of using a Bitbucket hook, you can also use a Bitbucket pipeline.
