68

I was hoping to write a Python script to create some appropriate environment variables by running the script in whatever directory I'll be executing some simulation code in, and I've read that I can't write a script to make these env vars persist in the macOS terminal. So two things:

Is this true?

and

It seems like it would be a useful thing to do; why isn't it possible in general?

physicsmichael
  • 4,793
  • 11
  • 35
  • 54
  • Duplicate: http://stackoverflow.com/questions/488366/how-do-i-make-environment-variable-changes-stick-in-python, http://stackoverflow.com/questions/235435/environment-variables-in-python-on-linux – S.Lott Apr 03 '09 at 23:01
  • 1
    This is more of a "It seems like this would be useful. Why isn't it possible?" – physicsmichael Apr 03 '09 at 23:05
  • But it doesn't matter. If I said it was a legal restriction, how does that change anything? You still can't do it. If I said it was an element of the Methodist Book of Discipline, how does that change anything? You still can't do it. – S.Lott Apr 03 '09 at 23:07
  • @vgm64 I think if you elaborate on the nature of the environment variables you are trying to set, we can collectively find an nice alternative solution using simple shell scripts. Python gets in the way because the interpreter is a separate process with its own env. You'd have to fork from Python... – Joe Holloway Apr 03 '09 at 23:33
  • 1
    "Why can't I shoplift?" and "Can I shoplift?" are two separate questions. I'm asking the former, but Benson has given a good technical answer (and surprisingly, a solution!). – physicsmichael Apr 03 '09 at 23:35
  • @jholloway7 You are correct. My situation requires an environment variable to be set for whatever directory I'll be doing some analysis in. My solution will be creating the alias: "alias geant4cwd='export G4WORKDIR=`pwd`'" Perfect solution. – physicsmichael Apr 03 '09 at 23:37
  • To change envvars in another process (like the parent shell), on Linux there is an answer here: http://unix.stackexchange.com/questions/38205/change-environment-of-a-running-process – jsbueno Aug 19 '15 at 12:15

9 Answers

43

You can't do it from python, but some clever bash tricks can do something similar. The basic reasoning is this: environment variables exist in a per-process memory space. When a new process is created with fork() it inherits its parent's environment variables. When you set an environment variable in your shell (e.g. bash) like this:

export VAR="foo"

What you're doing is telling bash to set the variable VAR in its process space to "foo". When you run a program, bash uses fork() and then exec() to run the program, so anything you run from bash inherits the bash environment variables.
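
You can see the direction of that inheritance with a quick experiment. A subshell (the parentheses below) is just a child process, so, assuming VAR isn't already set in your shell, nothing the child exports survives back in the parent:

$ (export VAR="foo"; echo "child sees: $VAR")
child sees: foo
$ echo "parent sees: $VAR"
parent sees: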

Now, suppose you want to create a bash command that sets some environment variable DATA with content from a file in your current directory called ".data". First, you need to have a command to get the data out of the file:

cat .data

That prints the data. Now, we want to create a bash command to set that data in an environment variable:

export DATA=`cat .data`

That command takes the contents of .data and puts it in the environment variable DATA. Now, if you put that inside an alias command, you have a bash command that sets your environment variable:

alias set-data='export DATA=`cat .data`'

Note the single quotes: they keep the backticks from being evaluated when the alias is defined, so `cat .data` runs against whatever directory you're in at the moment you use the command. You can put that alias command inside the .bashrc or .bash_profile files in your home directory to have that command available in any new bash shell you start.
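
For instance, with a .data file in the current directory (the contents below are just an example) and the alias loaded from .bashrc:

$ echo "some data" > .data
$ set-data
$ echo $DATA
some data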

Benson
  • 22,457
  • 2
  • 40
  • 49
20

One workaround is to output export commands, and have the parent shell evaluate this..

thescript.py:

import pipes
import random
r = random.randint(1,100)
print("export BLAHBLAH=%s" % (pipes.quote(str(r))))

..and the bash alias (the same can be done in most shells.. even tcsh!):

alias setblahblahenv='eval $(python thescript.py)'

Usage:

$ echo $BLAHBLAH

$ setblahblahenv
$ echo $BLAHBLAH
72

You can output any arbitrary shell code, including multiple commands like:

export BLAHBLAH=23 SECONDENVVAR='something else' && echo 'everything worked'

Just remember to be careful about escaping any dynamically created output (the pipes.quote function is good for this).
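
For instance, here's a throwaway value run through the quoting (in Python 3 the same function is also available as shlex.quote):

$ python -c 'import pipes; print("export MSG=%s" % pipes.quote("two words; $HOME"))'
export MSG='two words; $HOME'

Eval'ing that sets MSG to the literal string, rather than letting the shell word-split it or expand $HOME.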

dbr
  • 165,801
  • 69
  • 278
  • 343
  • What if we want to set multiple environment variables and provide some feedback on the execution like success or failure ? – chmike Jan 22 '13 at 09:54
  • 1
    @chmike the `export` command allows for multiple variables like `export BLAH=123 ANOTHER=321 ANDMORE=434`. For success/failure message, you can output any command and it'll be executed (like an `echo` command) – dbr Jan 22 '13 at 12:51
5

If you set environment variables within a python script (or any other script or program), it won't affect the parent shell.

Edit clarification: So the answer to your question is yes, it is true. You can, however, export from within a shell script and source it using the dot invocation:

in fooexport.sh

export FOO="bar"

at the command prompt

$ . ./fooexport.sh
$ echo $FOO
bar
Stefano Borini
  • 138,652
  • 96
  • 297
  • 431
  • So what about writing an alias that 1) runs a python script in cwd that creates a shell script with environmental variables i need based on the cwd and 2) invokes that script as you've written? – physicsmichael Apr 03 '09 at 23:09
  • Stefano, I checked out the Papers application mentioned on ForTheScience.org. I'm glad you decided to answer my question! If I obtain the student discount it will definitely worth the price, I suspect, so thank you. – physicsmichael Apr 03 '09 at 23:48
  • oh yes, the alias as you said will work, because it is invoked by your current shell. That's definitely a solution to your issue. – Stefano Borini Apr 04 '09 at 00:23
3

It's not generally possible. The new process created for Python cannot affect its parent process's environment. Neither can the parent affect the child's environment once the child is running, but the parent gets to set up the child's environment as part of new process creation.

Perhaps you can set them in .bashrc, .profile or the equivalent "runs on login" or "runs on every new terminal session" script on macOS.

You can also have Python start the simulation program with the desired environment (use the env parameter to subprocess.Popen, http://docs.python.org/library/subprocess.html):

import subprocess, os
os.chdir('/home/you/desired/directory')
# Note: env= replaces the child's entire environment, so include everything the program needs.
subprocess.Popen(['desired_program_cmd', 'args', ...], env=dict(SOMEVAR='a_value'))

Or you could have python write out a shell script like this to a file with a .sh extension:

export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd

and then chmod +x it and run it from anywhere.
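
For example (run_sim.sh is just a hypothetical name for the generated file):

$ chmod +x run_sim.sh
$ ./run_sim.sh

SOMEVAR is set only inside that script and the programs it launches, not in your interactive shell.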

Joe Koberg
  • 25,416
  • 6
  • 48
  • 54
  • The simulation software I'm using requires a certain environment variable to be set **by hand** for any directory you want to do an analysis in. I see this as a terrible approach and thought to fix it, or at least automate it, with python. – physicsmichael Apr 03 '09 at 23:07
2

What I like to do is use /usr/bin/env in a shell script to "wrap" my command line when I find myself in similar situations:

#!/bin/bash

/usr/bin/env NAME1="VALUE1" NAME2="VALUE2" "$@"

So let's call this script "myappenv". I put it in my $HOME/bin directory which I have in my $PATH.

Now I can invoke any command using that environment by simply prepending "myappenv" as such:

myappenv dosometask -xyz

Other posted solutions work too, but this is my personal preference. One advantage is that the environment is transient, so if I'm working in the shell only the command I invoke is affected by the altered environment.

Modified version based on new comments

#!/bin/bash

/usr/bin/env G4WORKDIR="$PWD" "$@"

You could wrap this all up in an alias too. I prefer the wrapper script approach since I tend to have other environment prep in there too, which makes it easier for me to maintain.
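
For example, with the modified wrapper on your $PATH (the directory and program names here are hypothetical):

$ cd ~/analysis/run42
$ myappenv printenv G4WORKDIR
/Users/you/analysis/run42
$ myappenv ./run_simulation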

Joe Holloway
  • 28,320
  • 15
  • 82
  • 92
1

As in Benson's answer, but the best hack-around is to create a simple bash function that passes its arguments along:

upsert-env-var () { eval $(python upsert_env_var.py "$@"); }

You can do whatever you want in your Python script with the arguments. To simply add a variable, use something like:

import os
import sys

var = sys.argv[1]
val = sys.argv[2]
if os.environ.get(var, None):
    # prepend the new value to whatever is already set
    print("export %s=%s:%s" % (var, val, os.environ[var]))
else:
    print("export %s=%s" % (var, val))

Usage:

upsert-env-var VAR VAL
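
For example (the paths are illustrative, and upsert_env_var.py is assumed to be reachable from wherever the function runs):

$ upsert-env-var G4WORKDIR "$PWD"
$ echo $G4WORKDIR
/Users/you/analysis/run42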
Niall
  • 41
  • 5
1

As others have pointed out, the reason this doesn't work is that environment variables live in a per-process memory space and thus die when the Python process exits.

They point out that a solution to this is to define an alias in .bashrc to do what you want such as this:

alias export_my_program='export MY_VAR=`my_program`'

However, there's another (a tad hacky) method which does not require you to modify .bashrc, nor to have my_program in $PATH (or to specify the full path to it in the alias). The idea is to run the program in Python if it is invoked normally (./my_program), but in Bash if it is sourced (source my_program). (Using source on a script does not spawn a new process and thus does not kill environment variables created within.) You can do that as follows:

my_program.py:

#!/usr/bin/env python3

_UNUSED_VAR=0
_UNUSED_VAR=0 \
<< _UNUSED_VAR
#=======================
# Bash code starts here
#=======================
'''
_UNUSED_VAR
export MY_VAR=`$(dirname $0)/my_program.py`
echo $MY_VAR
return
'''

#=========================
# Python code starts here
#=========================
print('Hello environment!')

Running this in Python (./my_program.py), the first 3 lines will not do anything useful and the triple-quotes will comment out the Bash code, allowing Python to run normally without any syntax errors from Bash.

Sourcing this in bash (source my_program.py), the heredoc (<< _UNUSED_VAR) is a hack used to "comment out" the first triple-quote, which would otherwise be a syntax error. The script returns before reaching the second triple-quote, avoiding another syntax error. The export assigns the result of running my_program.py in Python from the correct directory (given by $(dirname $0)) to the environment variable MY_VAR. echo $MY_VAR prints the result on the command-line.

Example usage:

$ source my_program.py
Hello environment!
$ echo $MY_VAR
Hello environment!

However, if run normally, the script will still do everything it did before except exporting the environment variable:

$ ./my_program.py
Hello environment!
$ echo $MY_VAR
                                <-- Empty line
nijoakim
  • 930
  • 10
  • 25
0

As noted by other authors, the environment is thrown away when the Python process exits. But for as long as the Python process is running, you can edit its environment, and any child processes it launches will inherit those changes. For example:

>>> os.environ["foo"] = "bar"
>>> import subprocess
>>> subprocess.call(["printenv", "foo"])
bar
0
>>> os.environ["foo"] = "foo"
>>> subprocess.call(["printenv", "foo"])
foo
0
macetw
  • 1,640
  • 1
  • 17
  • 26
0

If it is possible to write to a file, another workaround is to use the source command. Have the Python script write the export statements to a file, which can then be sourced to set the variables. I have used this on Linux. For example:

with open("vars", "w") as f:
  f.write('export foo=bar\n')
  f.write('export fruit="apple"\n')

Then, in the shell (or in the script that launches the simulation):

source vars

The variables will be available at least for the given shell.
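
For example, if the Python snippet above is saved as write_vars.py (the name is arbitrary):

$ python write_vars.py
$ source vars
$ echo $fruit
apple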