I need to upload updated files to multiple EC2 instances behind a single load balancer. My problem is that I missed some EC2 instances and it broke my webpage. Is there a tool that can upload multiple files to multiple EC2 Windows servers in a single click?

I update my files weekly, sometimes daily. I have looked at Elastic Beanstalk, AWS CodeDeploy, and Amazon EFS, but they are hard to use. Can anyone help?

Rahmathullah

4 Answers

I suggest using AWS S3 and the AWS CLI. Install the AWS CLI on each EC2 instance and create a bucket in S3.

Then set up a cron job on each EC2 instance with the following syntax:

aws s3 sync s3://bucket-name/folder-on-bucket /path/to/local/folder 

When you upload new files to the S3 bucket, they will automatically sync to all of the EC2 instances behind your load balancer. S3 also acts as the central directory where you upload and delete files.
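For example, a crontab entry along these lines would do it (a minimal sketch; the five-minute interval, bucket name, and local path are placeholders to adjust):

# Pull new or changed files from S3 every 5 minutes (add via crontab -e)
*/5 * * * * aws s3 sync s3://bucket-name/folder-on-bucket /path/to/local/folder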

Piyush Patil

You could leverage the AWS CLI and run something like:

aws elb describe-load-balancers --load-balancer-name <name_of_your_lb> --query 'LoadBalancerDescriptions[].Instances[].InstanceId' --output text | tr '\t' '\n' |\
xargs -I {} aws ec2 describe-instances --instance-ids {} --query 'Reservations[].Instances[].PublicIpAddress' --output text |\
xargs -I {} scp <name_of_your_file> <your_username>@{}:/some/remote/directory

Basically it goes like this:

  1. find all the EC2 instances attached to your load balancer
  2. for each instance, look up its public IP address (presumably the instances have one, since you can connect to them through scp)
  3. run the scp command to copy the file somewhere on each EC2 server

You can also copy a whole folder (scp -r) if you need to push many files; it might be easier.

Amazon Elastic File System (EFS) would probably now be the easiest option. You create your file system and attach it to all of the EC2 instances behind the load balancer; any file you transfer to the EFS is then available on every instance where it is mounted. The setup (creating the EFS and mounting it on your instances) only has to be done once.
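Mounting might look like this on each Linux instance (a sketch; the file-system ID, region, and mount point are placeholders, and the instances' security groups must allow NFS traffic):

# Mount the EFS file system over NFSv4.1 (fs-12345678 and us-east-1 are placeholders)
sudo mkdir -p /mnt/efs
sudo mount -t nfs4 -o nfsvers=4.1 fs-12345678.efs.us-east-1.amazonaws.com:/ /mnt/efs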

Frederic Henri

Create a script containing a few robocopy commands and run it whenever you want to update the files on your servers. Something like this:

robocopy Source Destination1 files
robocopy Source Destination2 files

You will also need to share the destination folder on each server with the user on your machine.
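A minimal batch script might look like the following (a sketch; the server names, shares, and robocopy options are placeholders to adapt):

:: update.bat - push the contents of C:\deploy to each web server's share
:: /MIR mirrors the tree; /R and /W limit retries and wait time between retries
robocopy C:\deploy \\WEB01\wwwroot /MIR /R:2 /W:5
robocopy C:\deploy \\WEB02\wwwroot /MIR /R:2 /W:5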

Mahdi

I had an application load balancer (ALB), so I had to build on @FredricHenri's answer.


EC2_PUBLIC_IPS=$(aws elbv2 --profile mfa describe-load-balancers --names c360-infra-everest-dev-lb --query 'LoadBalancers[].LoadBalancerArn' --output text \
    | xargs -n 1 -I {} aws elbv2 --profile mfa describe-target-groups --load-balancer-arn {} --query 'TargetGroups[].TargetGroupArn' --output text \
    | xargs -n 1 -I {} aws elbv2 --profile mfa describe-target-health --target-group-arn {} --query 'TargetHealthDescriptions[*].Target.Id' --output text \
    | xargs -n 1 -I {} aws ec2 --profile mfa describe-instances --instance-ids {} --query 'Reservations[].Instances[].PublicIpAddress' --output text)
echo ${EC2_PUBLIC_IPS}
echo ${EC2_PUBLIC_IPS} | tr ' ' '\n' | xargs -I {} scp -i ${EC2_SSH_KEY_FILE} ../swateek.txt ubuntu@{}:/home/ubuntu/

Points to Note

  1. I have used an AWS CLI profile called "mfa"; this is optional.
  2. The environment variable EC2_SSH_KEY_FILE is the path to the .pem file used to access the EC2 instances.
swateek