
It's been some time since I've been trying to figure out a really easy way to do this.

I am using GitLab CI/CD and want to move the built files from there to an AWS EC2 instance. The problem is that I found two approaches, both of which seem like really bad ideas:

  1. Build the project on GitLab CI/CD, then SSH into the EC2 instance, pull the project again from there, and run the npm scripts. This is really wrong, and I won't go into details why.
  2. I saw the following: How to deploy with Gitlab-Ci to EC2 using AWS CodeDeploy/CodePipeline/S3, but it's so big and complex.

Isn't there an easier way to copy built files from GitLab CI/CD to AWS EC2?

Nika Kurashvili

1 Answer


I use GitLab as well, and what has worked for me is configuring my runners on EC2 instances. A few options come to mind:

  1. I'd suggest managing your own runners (vs. shared runners), giving them permission to drop built files into S3, and having your instances pick them up from there. You could trigger SSM commands from the runner targeting your instances (preferably by tags), and they'll download the built files.
  2. You could also look into S3 event notifications. I've used them to trigger Lambda functions on object uploads: it's pretty fast and offers retry mechanisms. The Lambda could then push SSM commands to the instances. https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html
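The first option above could be sketched as a deploy job in `.gitlab-ci.yml`. This is a minimal sketch, not a drop-in config: the bucket name `my-build-artifacts`, the `dist/` build directory, the target path `/var/www/app`, and the `Role=web` tag are all hypothetical placeholders, and it assumes the runner has AWS credentials (e.g. via CI/CD variables or an instance role) with S3 and SSM permissions.

```yaml
deploy:
  stage: deploy
  image: amazon/aws-cli
  script:
    # Upload the built files to S3
    - aws s3 cp dist/ s3://my-build-artifacts/latest/ --recursive
    # Tell tagged instances (running the SSM agent) to pull the files down
    - >-
      aws ssm send-command
      --document-name "AWS-RunShellScript"
      --targets "Key=tag:Role,Values=web"
      --parameters 'commands=["aws s3 cp s3://my-build-artifacts/latest/ /var/www/app --recursive"]'
```

`AWS-RunShellScript` is a built-in SSM document, so nothing custom needs to be created for this to work; the instances just need the SSM agent and read access to the bucket.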
peter n
  • What do I need to install on AWS so that instance picks files from S3? – Nika Kurashvili Jul 09 '20 at 23:41
  • You'll need to install the SSM agent on the instances; some AMIs already have it pre-installed. The agent polls the Systems Manager service for commands. More info here: https://docs.aws.amazon.com/systems-manager/latest/userguide/ssm-agent.html. The call is also async, so you have to poll for the script results. The instance also needs GetObject* permissions on the S3 bucket. – peter n Jul 10 '20 at 02:11
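If you'd rather trigger the SSM command from code (e.g. from the Lambda in option 2) instead of the CLI, a boto3 sketch might look like this. The bucket, prefix, tag, and target path are hypothetical placeholders, and the helper that builds the `send_command` arguments is my own illustration, not part of any library.

```python
def build_send_command_params(bucket, prefix, tag_key, tag_value):
    """Build the arguments for ssm.send_command, targeting instances by tag
    and telling them to sync the built files down from S3."""
    return {
        # Target instances by tag instead of hard-coding instance IDs
        "Targets": [{"Key": f"tag:{tag_key}", "Values": [tag_value]}],
        # AWS-RunShellScript is a built-in SSM document
        "DocumentName": "AWS-RunShellScript",
        "Parameters": {
            "commands": [
                f"aws s3 cp s3://{bucket}/{prefix} /var/www/app --recursive"
            ]
        },
    }

if __name__ == "__main__":
    import boto3  # only needed for the actual AWS call

    ssm = boto3.client("ssm")
    params = build_send_command_params("my-build-artifacts", "latest/", "Role", "web")
    response = ssm.send_command(**params)
    command_id = response["Command"]["CommandId"]
    # send_command is async (as the comment above notes): poll
    # ssm.get_command_invocation(CommandId=command_id, InstanceId=...)
    # per instance to read each script's status and output.
    print(command_id)
```

The polling step matters: `send_command` returns as soon as the command is queued, so success of the API call does not mean the deploy script succeeded on the instances.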