
I have the following function that is used to execute system commands in Python:

import os

def engage_command(command=None):
    # os.system(command)
    return os.popen(command).read()

I am using the os module instead of the subprocess module because I am dealing with a single environment in which I am interacting with many environment variables etc.

How can I use Bash with this type of function instead of the default sh shell?

d3pd

1 Answer

import subprocess

output = subprocess.check_output(command, shell=True, executable='/bin/bash')

os.popen() is implemented in terms of the subprocess module.
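As a quick check (my addition, not part of the original answer), `$0` expands to the name of the shell interpreting the command, so it reveals which shell actually runs it, assuming a Unix system with `/bin/bash` installed:

```python
import subprocess

# $0 expands to the name of the shell interpreting the command,
# so this shows which shell each call actually launched.
sh_name = subprocess.check_output('echo $0', shell=True).decode().strip()
bash_name = subprocess.check_output(
    'echo $0', shell=True, executable='/bin/bash'
).decode().strip()
print(sh_name)    # typically /bin/sh
print(bash_name)  # /bin/bash
```

Without `executable=`, `shell=True` uses `/bin/sh`; passing `executable='/bin/bash'` makes the same call run bash instead.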


I am dealing with a single environment in which I am interacting with many environment variables etc.

  1. each os.popen(cmd) call creates a new /bin/sh process to run the cmd shell command.

    Perhaps it is not obvious from the os.popen() documentation, which says:

    Open a pipe to or from command cmd

    "open a pipe" does not communicate clearly that it means "start a new shell process with a redirected standard input or output" -- you could report a documentation issue.

    If there is any doubt, the source confirms that each successful os.popen() call creates a new child process.

  2. the child can't modify its parent process's environment (normally).

Consider:

import os
#XXX BROKEN: it won't work as you expect
print(os.popen("export VAR=value; echo ==$VAR==").read())
print(os.popen("echo ==$VAR==").read())

Output:

==value==

====

==== means that $VAR is empty in the second command because the second command runs in a different /bin/sh process from the first one.

To run several bash commands inside a single process, put them in a script or pass as a string:

output = check_output("\n".join(commands), shell=True, executable='/bin/bash')

Example:

#!/usr/bin/env python
from subprocess import check_output

output = check_output("""
    export VAR=value; echo ==$VAR==
    echo ==$VAR==
    """, shell=True, executable='/bin/bash')
print(output.decode())

Output:

==value==
==value==

Note: $VAR is not empty here.

If you need to generate new commands dynamically (based on the output from previous commands), that creates several issues, some of which can be addressed using the pexpect module.
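As a rough sketch (my own addition, not from the answer's linked example), a single long-lived bash process can be fed a fixed batch of commands via `subprocess.Popen`; truly interactive back-and-forth still needs something like `pexpect`:

```python
import subprocess

# One persistent /bin/bash process: every line written to its stdin runs
# in the same process, so exported variables survive between commands.
proc = subprocess.Popen(
    ['/bin/bash'],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    universal_newlines=True,
)
# communicate() sends all input, closes stdin, and waits for exit --
# fine for a fixed batch, but not for commands chosen based on earlier output.
out, _ = proc.communicate('export VAR=value\necho ==$VAR==\n')
print(out)
```

For genuinely dynamic interaction (reading output before deciding the next command), `pexpect.spawn('/bin/bash')` avoids the deadlocks that naive blocking reads on pipes can cause.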

jfs
  • Thanks for your suggestion. As I mentioned in the question, I am interacting with a lot of environment variables and whatnot, so, for my purposes, the `os` module is what I am trying to use. – d3pd Nov 03 '15 at 15:23
  • 3
    @d3pd: using the `os` module gives you nothing here. Click the link: `os.popen()` uses `subprocess` under the hood anyway. – jfs Nov 03 '15 at 15:24
  • @JFSebastian Ok, thanks for clarifying that. Do you suppose you could include an example of how you could use this approach to set and use environment variables? -- perhaps something like the following: `export MY_DIRECTORY="~/test"`, `ls "${MY_DIRECTORY}"` – d3pd Nov 03 '15 at 17:01
  • 1
    @d3pd: I've updated the answer to show how envvars behave in multiple commands. – jfs Nov 03 '15 at 17:05
  • @JFSebastian Thank you very much for those clear and helpful details. Your solution clarifies the subprocess nature of `os.popen` and shows how to use Bash with subprocessing. (Would you happen to know if there is some way to approach the idea of maintaining the existence of the Bash environment of a subprocess that is created and accessed from Python?) – d3pd Nov 03 '15 at 17:24
  • @d3pd: I've added a link that shows how to use `pexpect`, to send multiple dynamic commands to and receive results from a single process. – jfs Nov 03 '15 at 17:29