
I am running a Docker image of Python code in two different environments:

  1. On Windows, via cmd: docker run --rm --name test -p 9000:8080 estimate_variance
  2. On Linux, via terminal: docker run --rm --name test -p 9000:8080 estimate_variance

The code itself uses an AWS S3 bucket. For some reason, in (1), authenticating with aws sso login gives me a credentials error, so I had to supply aws_access_key_id, aws_secret_access_key, and aws_session_token explicitly when creating the client:

s3 = boto3.client('s3', aws_access_key_id=..., aws_secret_access_key=..., aws_session_token=...)

In (2), the authentication works fine.

Is there any way to detect within the code which environment I am running?

Something like:

import boto3
env = detect_env()
if env == 'linux':
    s3 = boto3.client('s3')
elif env == 'win':
    s3 = boto3.client('s3', aws_access_key_id=..., aws_secret_access_key=...,
                      aws_session_token=...)
else:
    pass
Benny K

1 Answer


The workaround I found was to set an environment variable, FROM_WINDOWS, only in (1):

docker run --env FROM_WINDOWS=1 --rm --name test -p 9000:8080 estimate_variance

and then check it in the code:

import os

def detect_env():
    # FROM_WINDOWS is set only by the Windows docker run command above
    if os.getenv('FROM_WINDOWS') is not None:
        return 'win'
    else:
        return 'linux'