
I have read this question before: How to SSH and run commands in EC2 using boto3?. Many answers simply say that users don't have to use SSH to connect to EC2 and run commands. However, I still don't have a clue how to run a Python script with boto3. In boto2 there is a function, run_instances, that lets a user pass a script into an EC2 node and run it, like the code below:

def run(self, **kwargs):
    ec2 = boto.connect_ec2(settings.PDF_AWS_KEY, settings.PDF_AWS_SECRET)
    sqs = boto.connect_sqs(settings.PDF_AWS_KEY, settings.PDF_AWS_SECRET)

    queue = sqs.create_queue(REQUEST_QUEUE)
    num = queue.count()
    launched = 0
    icount = 0

    # Count how many instances of our AMI are already running.
    reservations = ec2.get_all_instances()
    for reservation in reservations:
        for instance in reservation.instances:
            if instance.state == "running" and instance.image_id == AMI_ID:
                icount += 1
    to_boot = min(num - icount, MAX_INSTANCES)

    if to_boot > 0:
        # Fill the placeholders in the bootstrap template, then pass the
        # result as user data so it runs when each instance boots.
        startup = BOOTSTRAP_SCRIPT % {
            'KEY': settings.PDF_AWS_KEY,
            'SECRET': settings.PDF_AWS_SECRET,
            'RESPONSE_QUEUE': RESPONSE_QUEUE,
            'REQUEST_QUEUE': REQUEST_QUEUE}
        r = ec2.run_instances(
            image_id=AMI_ID,
            min_count=to_boot,
            max_count=to_boot,
            key_name=KEYPAIR,
            security_groups=SECURITY_GROUPS,
            user_data=startup)
        launched = len(r.instances)
    return launched

BOOTSTRAP_SCRIPT is a Python script template; after the placeholders are filled in, it is passed to the new instances as user data.
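(For reference, here is a hedged sketch of what such a template might look like. The actual BOOTSTRAP_SCRIPT is not shown in the question; only the placeholder names come from the substitution dict above, and the queue-polling logic is hypothetical.)

# Hypothetical sketch of a BOOTSTRAP_SCRIPT template. The %(KEY)s-style
# placeholders are filled in by `BOOTSTRAP_SCRIPT % {...}` in run() above,
# and the result is executed on the instance at first boot via user data.
BOOTSTRAP_SCRIPT = """#!/usr/bin/env python
import boto

sqs = boto.connect_sqs('%(KEY)s', '%(SECRET)s')
request_queue = sqs.create_queue('%(REQUEST_QUEUE)s')
response_queue = sqs.create_queue('%(RESPONSE_QUEUE)s')

# Poll the request queue, do the work, and report back.
while True:
    msg = request_queue.read(wait_time_seconds=20)
    if msg is None:
        continue
    # ... process the request here ...
    response_queue.write(response_queue.new_message(body=msg.get_body()))
    request_queue.delete_message(msg)
"""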

I wrote some code with boto3:

# -*- coding: utf-8 -*-

SCRIPT_TORUN = """

import boto3

bucket = random_str()       # random_str() is assumed to be defined elsewhere
image_name = random_str()
s3 = boto3.client('s3')
Somedata = 'hello,update'

upload_path = 'test/' + image_name
s3.put_object(Body=Somedata, Bucket='cloudcomputing.assignment.storage', Key=upload_path)

"""

import boto3

running_instance = []
ec2 = boto3.resource('ec2')

# Collect the IDs of all running instances.
for instance in ec2.instances.all():
    if instance.state['Name'] == 'running':
        running_instance.append(instance.id)
        print(instance.id, instance.state)
print(running_instance)

I can get the details of the running instances. Can anybody tell me whether there is a function like run_instances in boto3 that I can use to run the script SCRIPT_TORUN on one of my already-running EC2 instances?

– Coding_Rabbit

4 Answers


See: Boto3 run_instances

The parameter you are looking for is: UserData='string'

UserData (string) --

The user data to make available to the instance. For more information, see Running Commands on Your Linux Instance at Launch (Linux) and Adding User Data (Windows). If you are using a command line tool, base64-encoding is performed for you, and you can load the text from a file. Otherwise, you must provide base64-encoded text.

This value will be base64 encoded automatically. Do not base64 encode this value prior to performing the operation.
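For example, here is a hedged sketch of launching an instance with a user-data script via boto3. The AMI ID, key pair name, and security group are placeholders, the upload script mirrors SCRIPT_TORUN from the question, and it assumes the AMI has Python and boto3 available:

import boto3

# Hypothetical user-data script; it runs once, as root, at first boot.
startup_script = """#!/usr/bin/env python
import boto3
s3 = boto3.client('s3')
s3.put_object(Body='hello,update',
              Bucket='cloudcomputing.assignment.storage',
              Key='test/some-image-name')
"""

ec2 = boto3.client('ec2')
response = ec2.run_instances(
    ImageId='ami-12345678',       # placeholder AMI ID
    MinCount=1,
    MaxCount=1,
    KeyName='my-keypair',         # placeholder key pair name
    SecurityGroups=['my-sg'],     # placeholder security group
    UserData=startup_script,      # boto3 base64-encodes this for you
)
print(response['Instances'][0]['InstanceId'])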

– helloV
  • Does that mean I can run a script once I get a running instance, without `SSH`? Can I specify one instance to run the script? I didn't find a parameter like 'instance_id' in the `run_instances` function. – Coding_Rabbit Oct 26 '17 at 13:42

Here is how to do it using another Python library called paramiko:

import boto3
import paramiko

user_name = 'ubuntu'
instance_id = 'i-08h873123123'                  # just an example
pem_addr = '/Users/folder1/.ssh/jack-aws.pem'   # path to the AWS instance key
aws_region = 'us-east-1'

ec2 = boto3.resource('ec2', region_name=aws_region)
instances = ec2.instances.filter(
    Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])

# Find the running instance we want to connect to.
for instance in instances:
    if instance.id == instance_id:
        p2_instance = instance
        break

# SSH into the instance with the key pair and run the commands.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
privkey = paramiko.RSAKey.from_private_key_file(pem_addr)
ssh.connect(p2_instance.public_dns_name, username=user_name, pkey=privkey)

# You can separate two shell commands with && or ;
cmd_to_run = ('dropbox start && source /home/ubuntu/anaconda3/bin/activate py36 '
              '&& cd /home/ubuntu/xx/yy/ && python3 func1.py')
stdin, stdout, stderr = ssh.exec_command(cmd_to_run, timeout=None, get_pty=False)
ssh.close()
– Aseem

If you want to run the script once and only once, specifically at EC2 launch time, then you can provide the script in user data when you call run_instances.

If, however, you want to run a script on one (or more) EC2 instances on an ad hoc basis, then you should look at either EC2 Systems Manager (the Run Command) or something like Fabric (example).
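Here is a hedged sketch of the SSM route with boto3. It assumes the target instance has the SSM agent installed and an instance profile that permits SSM; the instance ID is a placeholder:

import time
import boto3

ssm = boto3.client('ssm')

# Run a shell command on an existing, running instance via SSM Run Command.
resp = ssm.send_command(
    InstanceIds=['i-0123456789abcdef0'],     # placeholder instance ID
    DocumentName='AWS-RunShellScript',
    Parameters={'commands': ['echo hello from SSM']},
)
command_id = resp['Command']['CommandId']

# Give the agent a moment, then fetch the result (real code should poll
# until the status leaves 'InProgress').
time.sleep(2)
result = ssm.get_command_invocation(
    CommandId=command_id,
    InstanceId='i-0123456789abcdef0',
)
print(result['Status'])
print(result['StandardOutputContent'])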

– jarmod
  • Can I specify one instance to run the script? I didn't find a parameter like 'instance_id' in the run_instances function. – Coding_Rabbit Oct 27 '17 at 07:27
  • The run_instances function doesn't take instance IDs as input. The function launches *new* EC2 instances. It *returns* instance IDs. – jarmod Oct 27 '17 at 11:43
  • What if I want to use an existing instance to run the script instead of creating a new instance? – Coding_Rabbit Oct 27 '17 at 15:07
  • As indicated earlier, use EC2 SSM or something like Fabric. – jarmod Oct 27 '17 at 15:12
  • I noticed that the parameter in `send_command` only accepts a string; what if I want to send a multi-line script to the instance to run? I looked through the official documents and didn't find a parameter like `UserData` in `send_command`. – Coding_Rabbit Oct 29 '17 at 14:13
  • Push the multi-line script onto the EC2 instance (via scp or otherwise) or pull the script into the EC2 instance (perhaps from S3), and then use SSM to invoke that script. You could also use a simple SSM one-liner to retrieve a script from S3, chmod +x the script, and then exec it, as a general-purpose method for executing *any* script (see the sketch after this thread). – jarmod Oct 29 '17 at 14:49
  • I wrote a command like `aws s3 cp s3://cloudcomputing.assignment.storage/Script/CCStorage.py CCStorage.py` and ran it on my own computer successfully. However, when I ran it via `send_command`, it failed. I checked the log in S3 and it said `Access denied`, even though I have already added the `Full S3 Access` permission to the user. Do you know of other reasons that could cause this problem? – Coding_Rabbit Oct 29 '17 at 19:00
  • Are you sure that the role you supplied when launching the EC2 instance has the necessary S3 permissions? You may want to SSH onto the instance to debug this further (e.g. by manually running `aws s3 cp ...`). – jarmod Oct 29 '17 at 19:34
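To illustrate jarmod's suggestion from the thread above, a hedged sketch of pulling a script from S3 and executing it via SSM. The bucket and key are taken from the comment above, the instance ID is a placeholder, and the instance's role must allow both SSM and S3 reads:

import boto3

ssm = boto3.client('ssm')

# 'commands' accepts a list of shell lines, so a multi-step (or multi-line)
# script is simply more entries in the list.
ssm.send_command(
    InstanceIds=['i-0123456789abcdef0'],   # placeholder instance ID
    DocumentName='AWS-RunShellScript',
    Parameters={'commands': [
        'aws s3 cp s3://cloudcomputing.assignment.storage/Script/CCStorage.py /tmp/CCStorage.py',
        'chmod +x /tmp/CCStorage.py',
        'python /tmp/CCStorage.py',
    ]},
)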

Here is how I have done it:

import boto3
import paramiko

ec2 = boto3.resource('ec2')

instances = ec2.instances.filter(
    Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])

# List the running instances so one can be chosen by index.
running = []
for i, instance in enumerate(instances):
    print(i, instance.id, instance.instance_type)
    running.append(instance)

x = int(input("Enter your choice: "))
chosen = running[x]

try:
    # SSH into the chosen instance with the key pair and run the script.
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    privkey = paramiko.RSAKey.from_private_key_file('address to .pem key')
    ssh.connect(chosen.public_dns_name, username='ec2-user', pkey=privkey)
    stdin, stdout, stderr = ssh.exec_command('python input_x.py')
    stdin.flush()
    data = stdout.read().splitlines()
    for line in data:
        print(line.decode())
    ssh.close()
except Exception as e:
    print(e)

For credentials I installed the AWS CLI package, then opened a terminal and ran

aws configure

and entered the details, which are saved and then automatically read by boto3 from the ~/.aws folder.
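For reference, `aws configure` writes two small plain-text files under `~/.aws`, roughly like this (all values are placeholders):

# ~/.aws/credentials
[default]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = exampleSecretAccessKey123

# ~/.aws/config
[default]
region = us-east-1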

  • Doesn't work when I try to import any module, e.g. `import numpy`. It gives an error saying numpy doesn't exist, but if I run the same code from the EC2 terminal, it works fine. – Aseem Jun 13 '18 at 13:44