
I have a file hosted in an S3 bucket with the following format:

var1=value1
var2=value2
var3=value3

I wish to create a bash script on my Linux box that, when executed, sets environment variables from the remote file. So far, I have tried the following:

#!/bin/sh

export $(aws s3 cp s3://secret-bucket/file.txt - | sed -e /^$/d -e /^#/d | xargs)

and also:

#!/bin/sh

eval $(aws s3 cp s3://secret-bucket/file.txt - | sed 's/^/export /')

But neither of them seems to work: when I execute printenv, the variables I need do not show up. Any help would be very much appreciated.

– realnsleo
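For context, the likely failure mode: executing either script starts a child shell, and the exported variables disappear when that child exits. A minimal sketch of a version that persists the variables (assuming bash and an already-configured AWS CLI; the bucket path is the one from the question) is to run the commands in the current shell instead:

# Run these lines in the current shell, or put them in a file and load it
# with `. ./load-env.sh` / `source load-env.sh` -- NOT `./load-env.sh`.
set -a                                   # auto-export every variable assigned below
. <(aws s3 cp s3://secret-bucket/file.txt - | sed -e '/^$/d' -e '/^#/d')
set +a                                   # stop auto-exporting
printenv | grep '^var'                   # var1, var2, var3 should now appear

Note that `<( … )` (process substitution) is a bash feature, so this sketch will not run under plain `sh`.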
  • Are these scripts being run through User Data, or are you invoking them yourself after login? – John Rotenstein Sep 29 '20 at 12:33
  • I am loading these variables after building a Docker container. I hope that helps? – realnsleo Sep 29 '20 at 12:38
  • 3
    The `#!/bin/sh` header looks like you are executing this as a separate script, meaning the assignments will be evaluated, then forgotten when the script exits. (Also, don't use Bash syntax in `sh` scripts; perhaps see also https://stackoverflow.com/questions/5725296/difference-between-sh-and-bash) – tripleee Sep 29 '20 at 13:12
  • 1
    If you could ssh to that host, try following `sshfs user@host:/patho/to/file ~/tmpdir; source ~/tmpdir/file_with_vars` – Ivan Sep 29 '20 at 13:27
  • @tripleee thank you. The link was very helpful in getting me to understand. I will dig in more to find something that works mainly for my use case. – realnsleo Sep 29 '20 at 15:51
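
Since the comments mention that the variables are loaded inside a Docker container, a hedged sketch of one common arrangement (the entrypoint file name is illustrative; the bucket path is from the question) is to fetch and source the file in the image's entrypoint, so the main process inherits the variables:

#!/bin/bash
# entrypoint.sh (illustrative name), set as the image's ENTRYPOINT.
set -euo pipefail

# Fetch the file and auto-export its var=value lines, skipping blanks and comments.
set -a
. <(aws s3 cp s3://secret-bucket/file.txt - | sed -e '/^$/d' -e '/^#/d')
set +a

# Replace this shell with the container's main command; it inherits the environment.
exec "$@"

Alternatively, fetch the file on the host first and pass it in with `docker run --env-file file.txt …`, which keeps AWS credentials out of the container entirely.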

0 Answers