109

I have a script that looks something like this:

export foo=/tmp/foo                                          
export bar=/tmp/bar

Every time I build I run 'source init_env' (where init_env is the above script) to set up some variables.

To accomplish the same in Python I had this code running,

reg = re.compile(r'export (?P<name>\w+)(=(?P<value>.+))*')
for line in open(file):
    m = reg.match(line)
    if m:
        name = m.group('name')
        value = ''
        if m.group('value'):
            value = m.group('value')
        os.putenv(name, value)

But then someone decided it would be nice to add a line like the following to the init_env file:

export PATH="/foo/bar:/bar/foo:$PATH"     

Obviously my Python script fell apart. I could modify the Python script to handle this line, but then it'll just break later on when someone comes up with a new feature to use in the init_env file.

The question is: is there an easy way to run a bash command and let it modify my os.environ?

Mogsdad
  • 44,709
  • 21
  • 151
  • 275
getekha
  • 2,495
  • 3
  • 18
  • 20
  • related: [Calling the “source” command from subprocess.Popen](http://stackoverflow.com/q/7040592/4279) – jfs Nov 05 '15 at 23:39

6 Answers

129

The problem with your approach is that you are trying to interpret bash scripts. First you just try to interpret the export statement. Then you notice people are using variable expansion. Later people will put conditionals in their files, or process substitutions. In the end you will have a full-blown bash script interpreter with a gazillion bugs. Don't do that.

Let Bash interpret the file for you and then collect the results.

You can do it like this:

#! /usr/bin/env python

import os
import pprint
import shlex
import subprocess

command = shlex.split("env -i bash -c 'source init_env && env'")
proc = subprocess.Popen(command, stdout=subprocess.PIPE, text=True)
for line in proc.stdout:
    key, _, value = line.rstrip("\n").partition("=")
    os.environ[key] = value
proc.communicate()

pprint.pprint(dict(os.environ))

Make sure to handle errors: bash may fail to source init_env, bash itself may fail to execute, or subprocess may fail to launch bash.
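For instance, here is a minimal sketch of the same approach with explicit error handling via subprocess.run (the function name source_env is mine, not part of the original; it inherits the same limitations around multiline and non-exported variables):

```python
import os
import subprocess

def source_env(script="init_env"):
    """Source a bash script in a clean environment and import its variables."""
    result = subprocess.run(
        ["env", "-i", "bash", "-c", 'source "$1" && env', "bash", script],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True,
    )
    if result.returncode != 0:
        # bash failed to start, or sourcing the script failed
        raise RuntimeError("sourcing %s failed: %s" % (script, result.stderr))
    for line in result.stdout.splitlines():
        key, _, value = line.partition("=")
        os.environ[key] = value
```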

The `env -i` at the beginning of the command line creates a clean environment, which means you will only get the environment variables from init_env. If you want the inherited system environment, omit `env -i`.

Read the documentation on subprocess for more details.

Note: this will only capture variables set with the export statement, as env only prints exported variables.
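Another limitation (noted in the comments) is multiline variables: line-based parsing breaks on values containing newlines. GNU env's `-0` flag emits NUL-delimited output, which avoids this. A sketch, assuming GNU coreutils (the helper name env_after_sourcing is mine):

```python
import subprocess

def env_after_sourcing(script):
    """Return the environment after sourcing `script`, using NUL-delimited
    "env -0" output so that multiline values survive (GNU coreutils only)."""
    out = subprocess.run(
        ["bash", "-c", 'source "$1" && env -0', "bash", script],
        stdout=subprocess.PIPE, check=True,
    ).stdout
    env = {}
    for entry in out.split(b"\0"):
        if entry:
            key, _, value = entry.partition(b"=")
            env[key.decode()] = value.decode()
    return env
```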

Enjoy.

Note that the Python documentation says that if you want to manipulate the environment you should manipulate os.environ directly instead of using os.putenv(). I consider that a bug, but I digress.
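The difference is easy to demonstrate: assigning into `os.environ` updates both the mapping and the real process environment, while `os.putenv()` changes the underlying environment without `os.environ` noticing (the variable names here are invented for the demo):

```python
import os

os.environ["FROM_ENVIRON"] = "visible"   # updates the mapping and the real env
os.putenv("FROM_PUTENV", "hidden")       # real env only; the mapping is unaware

print("FROM_ENVIRON" in os.environ)  # True
print("FROM_PUTENV" in os.environ)   # False -- os.environ was captured at startup
```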

Lesmana
  • 25,663
  • 9
  • 82
  • 87
  • 13
    If you do care about non-exported variables and the script is outside of your control, you can use set -a to mark all variables as exported. Just change the command to: ['bash', '-c', 'set -a && source init_env && env'] – ahal Sep 25 '13 at 01:46
  • Note that this will fail on exported functions. I'd love it if you could update your answer showing parsing that works for functions too. (e.g. function fff() { echo "fff"; }; export -f fff) – D. A. Apr 14 '14 at 21:32
  • For Windows use: ['cmd', '/C', filename + ' && set'] worked like a charm – Sergio Martins Dec 29 '14 at 13:13
  • It would be good to remove whitespaces from key and values, like this, os.environ[key.strip()] = value.strip() – Prafulla Oct 17 '16 at 21:20
  • This doesn't work. Using "&&" causes the output of `env` to be hidden. It should be replaced with ";". – Cerin Jan 18 '17 at 10:10
  • 4
    Note: this does not support multiline environment variables. – BenC Jun 12 '17 at 16:08
  • 4
    In my case, iterating over `proc.stdout()` yields bytes, thus I was getting a `TypeError` on `line.partition()`. Converting to string with `line.decode().partition("=")` solved the problem. – Sam F Jun 13 '17 at 09:54
  • 1
    This was super helpful. I executed `['env', '-i', 'bash', '-c', 'source .bashrc && env']` to give myself only the environment variables set by the rc file – xaviersjs Sep 04 '18 at 22:09
  • If the `init_env` script returns a non-zero error code, the `source` command will fail. If this is the case, the `&&` operator will prevent `env` from being executed, which means that the whole command will fail. Replacing `&&` with `;` helps, as noted by @cerin. – peter.slizik Mar 01 '19 at 14:45
  • @peter.slizik If `init_env` returns an non-zero code, then there is an error in `init_env` and *that error should be fixed there* rather than ignore it by using `;`. – Louis Sep 25 '19 at 13:12
  • Very useful to know. Is it possible to source the python script into a parent bash shell and use python to set the environment variables for that shell? – openCivilisation Sep 05 '20 at 07:21
  • Couple of useful things to note: 1. This will not work as expected in the case given by OP, since the `env -i` will wipe out the `$PATH` variable. To preserve that, remove the `-i`. 2. Its better to check `if key == "_"` and avoid an update in that case, since `$_` is the script name. – Cnoor0171 Nov 12 '20 at 00:17
34

Using pickle (this version is Python 2):

import os, pickle
# For clarity, I moved this string out of the command
source = 'source init_env'
dump = '/usr/bin/python -c "import os,pickle;print pickle.dumps(os.environ)"'
penv = os.popen('%s && %s' %(source,dump))
env = pickle.loads(penv.read())
os.environ = env

Updated:

This uses json and subprocess, and explicitly invokes /bin/bash (on Ubuntu the default /bin/sh is dash, which lacks the source builtin):

import os, subprocess as sp, json
source = 'source init_env'
# assumes a python3 interpreter at /usr/bin/python3
dump = '/usr/bin/python3 -c "import os, json; print(json.dumps(dict(os.environ)))"'
pipe = sp.Popen(['/bin/bash', '-c', '%s && %s' % (source, dump)], stdout=sp.PIPE)
env = json.loads(pipe.stdout.read())
os.environ.clear()
os.environ.update(env)
Brian
  • 1,026
  • 11
  • 18
  • This one has a problem on Ubuntu - the default shell there is `/bin/dash`, which does not know the `source` command. In order to use it on Ubuntu, you have to run `/bin/bash` explicitly, e.g. by using `penv = subprocess.Popen(['/bin/bash', '-c', '%s && %s' %(source,dump)], stdout=subprocess.PIPE).stdout` (this uses the newer `subprocess` module which has to be imported). – Martin Pecka May 13 '14 at 18:53
28

Rather than having your Python script source the bash script, it would be simpler and more elegant to have a wrapper script source init_env and then run your Python script with the modified environment.

#!/bin/bash
source init_env
/run/python/script.py
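If you do need to stay in Python, the same idea can be expressed there: run the target command under the sourced environment without touching the current process (a sketch; run_with_sourced_env is a hypothetical helper name, not from the answer):

```python
import subprocess

def run_with_sourced_env(env_script, command):
    """Source `env_script` in bash, then exec `command` in that environment.
    The current Python process's environment is left untouched."""
    return subprocess.run(
        ["bash", "-c", 'source "$1" && shift && exec "$@"', "bash", env_script]
        + command,
        check=True,
    )
```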
John Kugelman
  • 349,597
  • 67
  • 533
  • 578
  • 5
    It may solve the problem in some circumstances, but not all of them. For example I am writing a python script that needs to do something *like* sourcing the file (actually it loads modules if you know what I'm talking about), and it needs to load a *different* module depending on some circumstances. So this would not solve my problem at all – Davide Aug 25 '16 at 15:41
  • 1
    This does answer the question in most cases, and I would use it wherever possible. I had a hard time making this work in my IDE for a given project. One possible modification might be to run the whole thing in a shell with the environment `bash --rcfile init_env -c ./script.py` – xaviersjs Sep 04 '18 at 22:35
  • This is my preferred solution as well. However, many debuggers like Visual Studio Code or PyCharm do not allow to launch debuggable Python script this way. – Mikko Ohtamaa Sep 28 '22 at 09:06
8

Updated @lesmana's answer for Python 3. Notice the use of env -i which prevents extraneous environment variables from being set/reset (potentially incorrectly given the lack of handling for multiline env variables).

import os, subprocess
if os.path.isfile("init_env"):
    command = 'env -i bash -c "source init_env && env"'
    for line in subprocess.getoutput(command).split("\n"):
        key, _, value = line.partition("=")
        os.environ[key] = value
Al Johri
  • 1,729
  • 22
  • 23
  • Using this gives me "PATH: undefined variable" because env -i unsets the path. But it does work without the env -i. Also be careful that the line might have multiple '=' – Fujii Jul 02 '18 at 22:56
7

Example wrapping @Brian's excellent answer in a function:

import json
import subprocess

# returns a dictionary of the environment variables resulting from sourcing a file
def env_from_sourcing(file_to_source_path, include_unexported_variables=False):
    source = '%ssource %s' % ("set -a && " if include_unexported_variables else "", file_to_source_path)
    dump = '/usr/bin/python3 -c "import os, json; print(json.dumps(dict(os.environ)))"'
    pipe = subprocess.Popen(['/bin/bash', '-c', '%s && %s' % (source, dump)], stdout=subprocess.PIPE)
    return json.loads(pipe.stdout.read())

I'm using this utility function to read aws credentials and docker .env files with include_unexported_variables=True.

JDiMatteo
  • 12,022
  • 5
  • 54
  • 65
0

The best workaround I found is this:

  • Write a wrapper bash script that calls your Python script
  • In that wrapper, source init_env first and then invoke the Python script, so it inherits the resulting environment
Darshan Bhat
  • 245
  • 1
  • 11