
I am invoking an AWS Batch job from a Lambda function using the boto3 client, calling the submit_job API. When submitting the job I configure custom environment variables, but the Python script running inside AWS Batch (within the Docker container) does not print them when I check the CloudWatch logs. It only prints the default AWS Batch environment variables.

I am using os.environ to print the environment variables in Python. The job is submitted as follows:

import boto3

client = boto3.client('batch')

response = client.submit_job(
    jobName='securejobname',
    jobQueue='securejobqueue',
    jobDefinition='securejobdefinition',
    parameters={
        'filedata': 'filetestdata'
    },
    containerOverrides={
        'environment': [
            {
                'name': 'myfileattribute',
                'value': 'simplefile.txt'
            },
        ],
    }
)
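
For reference, this is roughly how the script inside the container prints the environment (a minimal sketch; the loop and the myfileattribute lookup are assumptions based on the override above):

import os

# Print every environment variable visible to this process;
# the AWS Batch defaults show up, but the custom override does not.
for key, val in sorted(os.environ.items()):
    print(f'{key}={val}')

# Direct lookup of the override submitted above
print(os.environ.get('myfileattribute'))  # reportedly prints None, not 'simplefile.txt'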

Do we need to configure anything specific?


1 Answer


Take a typical case in which the Python script is launched from within a bash script running inside the Batch container, as follows:

#!/bin/bash
# some batch steps
...
# invoke python script that accesses the batch job environment variables $name and $value
python myscript.py

The above will fail because the Python script cannot see the batch job environment variables. The script is launched in a child process of the shell, and a child process inherits only variables that have been exported into the environment; plain shell variables are not passed down.
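
The shell-variable vs. environment-variable distinction can be demonstrated from Python itself (a minimal sketch; the variable names unexported and exported are illustrative):

import subprocess

# A plain shell assignment stays local to the shell; only the
# export-ed variable is inherited by the child Python process.
script = '''
unexported="not visible"
export exported="visible"
python3 -c 'import os; print(os.environ.get("unexported"), os.environ.get("exported"))'
'''
subprocess.run(['bash', '-c', script])
# prints: None visible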

The fix is to add export statements, as follows:

#!/bin/bash
# some batch steps
...
# export batch job environment variables so that python has access to them
export name="$name"
export value="$value"

# invoke python script that accesses the batch job environment variables $name and $value
python myscript.py

Python will now happily access the environment variables. See also this related answer on export: Defining a variable with or without export
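
To close the loop, after the exports the script can read the values as usual (a minimal sketch of myscript.py, assuming the variable names from the snippet above):

import os

# Both variables were export-ed by the wrapper script,
# so they now appear in this process's environment.
print(os.environ.get('name'))
print(os.environ.get('value'))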

elukem