I am looking to port a bash script that makes heavy use of environment variables to a Python script. For a quick sanity check, I would like to follow a similar approach and define all environment variables (or a selection of them) as Python global variables.
Is there a better way of doing this besides defining each variable one by one with VAR = os.getenv('VAR')?
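For a single module, one shortcut is to update `globals()` directly from `os.environ`. A minimal sketch (`DEMO_VAR` is a made-up variable name for illustration):

```python
import os

# Ensure the illustrative variable exists in the environment
os.environ["DEMO_VAR"] = "hello"

# Pull a selection of environment variables into this module's
# global namespace in one shot, instead of one assignment per name.
wanted = ["DEMO_VAR"]
globals().update({k: os.environ[k] for k in wanted})

print(DEMO_VAR)  # hello
```

This only affects the module it runs in, which is exactly the limitation the updates below run into.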
Update: I think this will do it. Contents of def_vars.py:
#!/usr/bin/env python3
import os

def define_env_vars(env_vars=None):
    """Define all (or a selected few) environment variables as Python global variables.

    Args:
        env_vars: list of selected environment variables to import, or None,
            in which case all environment variables are imported
    Returns:
        None
    """
    if env_vars is None:
        env_vars = os.environ
    else:
        env_vars = {k: os.environ[k] for k in env_vars}
    for k, v in env_vars.items():
        globals()[k] = v

# Test
import_env_vars = ["PWD"]
define_env_vars(import_env_vars)
print(PWD)
But using this from another Python module doesn't seem to work. Why not?
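The reason it fails across modules is that `globals()` inside a function refers to the namespace of the module where the function was *defined*, not the caller's. A self-contained sketch of my own (using `types.ModuleType` to simulate a second module, and a made-up `DEMO_VAR`):

```python
import os
import types

# Simulate a helper module whose function writes into globals().
helper = types.ModuleType("helper")
exec(
    "import os\n"
    "def define_env_vars(names):\n"
    "    for k in names:\n"
    "        globals()[k] = os.environ.get(k, '')\n",
    helper.__dict__,
)

os.environ["DEMO_VAR"] = "hello"
helper.define_env_vars(["DEMO_VAR"])

# The name lands in the helper module's namespace, not the caller's:
print(helper.DEMO_VAR)          # hello
print("DEMO_VAR" in globals())  # False
```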
Update 2: It does work, but only if the variables are prefixed with the module name, which I also want to avoid. Contents of test.py:
#!/usr/bin/env python3
import def_vars
# from def_vars import *  # This doesn't work
env_vars = ["PWD"]
def_vars.define_env_vars(env_vars)
print(def_vars.PWD)
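A star-import can't fix this either: `from def_vars import *` copies the names that exist at import time, so names the helper creates later are never re-exported to the importer. A small illustration of mine of that timing issue:

```python
import types

# A simulated module that starts with one name defined.
mod = types.ModuleType("m")
exec("A = 1", mod.__dict__)

# Emulate what `from m import *` captures at this moment:
# a snapshot of the public names currently in the module.
snapshot = {k: v for k, v in mod.__dict__.items() if not k.startswith("_")}

# A name added to the module *after* the import...
exec("B = 2", mod.__dict__)

print("A" in snapshot)  # True
print("B" in snapshot)  # False -- later names never reach the importer
```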
Update 3: This does what I needed; it puts the variables into the caller module's globals().
#!/usr/bin/env python3
import os
import inspect

def define_env_vars(env_vars=None):
    """Define all (or a selected few) environment variables as Python global
    variables of the caller module.

    Args:
        env_vars: list of selected environment variables to import, or None,
            in which case all environment variables are imported
    Returns:
        None
    """
    if env_vars is None:
        env_vars = os.environ
    else:
        env_vars = {k: os.environ[k] for k in env_vars}
    for k, v in env_vars.items():
        # stack()[1] is the caller's frame record; [0] is its frame object
        inspect.stack()[1][0].f_globals[k] = v
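An alternative that avoids frame inspection entirely is to have the caller pass its own `globals()` explicitly, which is more transparent than reaching up the stack. A sketch under that assumption (again with a hypothetical `DEMO_VAR`):

```python
import os

def define_env_vars(namespace, env_vars=None):
    """Copy environment variables into the given namespace dict.

    The caller passes its own globals() explicitly, so no stack
    inspection is needed.
    """
    if env_vars is None:
        source = dict(os.environ)
    else:
        source = {k: os.environ[k] for k in env_vars}
    namespace.update(source)

os.environ["DEMO_VAR"] = "hello"
define_env_vars(globals(), ["DEMO_VAR"])
print(DEMO_VAR)  # hello
```

The explicit parameter also makes the function usable with any dict-like target, not just module globals.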