
First off, I'm pretty new to AWS, and it took me a lot of trial and error to get my Lambda function to execute my Python script, which sits on an EC2 instance.

If I run my code manually through the command line on my EC2 instance, the code works perfectly: it calls the requested API and saves down the data.

If I call my script through a Lambda function using SSH, it stops executing at the API call. The Lambda returns that everything ran, but it didn't; I get no output messages to say there was an exception, and nothing in the CloudWatch log either. I know it starts to execute my code, because if I put print statements before the API calls, I see them returned in the CloudWatch log.

Any ideas to help out a noob.

Here is my lambda code:


import time
import boto3
import json
import paramiko


def lambda_handler(event, context):

    ec2 = boto3.resource('ec2', region_name='eu-west-2')
    instance_id = 'removed_id'

    instance = ec2.Instance(instance_id)

    # Start the instance
    instance.start()
    
    s3_client = boto3.client('s3')


    # Download private key file from secure S3 bucket
    # and save it inside /tmp/ folder of lambda event
    s3_client.download_file('removed_bucket', 'SSEC2.pem',
                            '/tmp/SSEC2.pem')

    # Allowing few seconds for the download to complete
    time.sleep(2)

    # Giving some time to start the instance completely
    time.sleep(60)
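    # (A possibly more robust alternative to the fixed sleep is
    # instance.wait_until_running(), which boto3's Instance resource
    # provides to poll until the instance reports a running state.)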

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    privkey = paramiko.RSAKey.from_private_key_file('/tmp/SSEC2.pem')
    # username is most likely 'ec2-user' or 'root' or 'ubuntu'
    # depending upon your ec2 AMI
    ssh.connect(
        instance.public_dns_name, username='ec2-user', pkey=privkey
    )
    print('Executing')
    stdin, stdout, stderr = ssh.exec_command(
       '/home/ec2-user/miniconda3/bin/python /home/ec2-user/api-calls/main.py')
    stdin.flush()
    data = stdout.read().splitlines()
    for line in data:
        print(line)

    ssh.close()

    # Stop the instance (currently disabled)
    # instance.stop()
    
    return {
        'statusCode': 200,
        'body': json.dumps('Execution successful')
    }
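
For reference, here is a variant of the exec_command section that also captures the remote command's exit status and stderr (a sketch reusing the paramiko ssh client from above), which should make remote Python tracebacks show up in the CloudWatch log instead of vanishing silently:

# Run the remote script and surface its exit status and stderr;
# assumes 'ssh' is the connected paramiko.SSHClient from above
stdin, stdout, stderr = ssh.exec_command(
    '/home/ec2-user/miniconda3/bin/python /home/ec2-user/api-calls/main.py')

out = stdout.read().decode()   # blocks until the command finishes
err = stderr.read().decode()   # remote tracebacks land here
exit_status = stdout.channel.recv_exit_status()

print('Remote exit status:', exit_status)
print('STDOUT:', out)
print('STDERR:', err)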

edit: okay, slight update. It's not falling over on the API call; it's actually stopping when it tries to open a config file, which is stored in "config/config.json". Now obviously this works in the EC2 environment when I'm executing manually, so this must have something to do with environment variables in EC2 not being the same if the job is triggered from elsewhere?? Here is the exact code:

    @staticmethod
    def get_config():
        with open("config/config.json", "r") as read_file:
            data = json.load(read_file)
        return data
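
A quick way to confirm what the script actually sees when launched over SSH (a sketch using only the standard library) is to log the working directory at the top of main.py; with exec_command the process typically starts in the user's home directory, so a relative "config/config.json" resolves against that, not the script's folder:

import os

# Show where the process is running from and where the relative
# path would actually resolve when launched via ssh.exec_command
print("cwd:", os.getcwd())
print("resolved path:", os.path.abspath("config/config.json"))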
  • Does this answer your question? https://stackoverflow.com/questions/3215727/paramiko-ssh-exec-commandshell-script-returns-before-completion – Hussain Bohra Feb 18 '21 at 21:55
  • Thanks Hussain. The code still quits as soon as it tries to make the API call in Python. Interestingly, I do get an error code of 1 returned by using your suggestion; however, that is all I get, no actual errors to screen, weird. – Andy Park Feb 19 '21 at 21:18

1 Answer


Problem solved. I needed to use full path names when executing the code remotely:

with open("/home/ec2-user/api-calls/config/config.json", "r") as read_file :