
I have one Python script. Let's call it controller.py. I'd like to use controller.py to run another Python script and pass several variables to it. Let's call the second script analyzer.py.

What is the best way to do this without importing analyzer.py as module? And how do I reference the variables I'm passing to analyzer.py within that script?

Here's my failed attempt using subprocess:

controller.py

import subprocess

var1='mytxt'
var2=100
var3=True
var4=[['x','y','z'],['x','c','d']]
var5=r"C:\\Users\\me\\file.txt"

myargs=var1,var2,var3,var4,var5
my_lst_str = ' '.join(map(str, myargs))
my_lst_str ='python analyzer.py '+my_lst_str

subprocess.call(my_lst_str,shell=True)

analyzer.py

print 'Argument List:', str(sys.argv)

I have looked through similar questions on Stack Overflow. One oft-recommended solution I tried was importing analyzer.py as a module, but analyzer.py defines many different functions. Using it as a module creates lots of nested functions, and managing the scope of variables within those nested functions was cumbersome.

I need to use Python 2 for these scripts. I'm on a Windows 10 machine.

thgro
    Does this answer your question? [What is the best way to call a script from another script?](https://stackoverflow.com/questions/1186789/what-is-the-best-way-to-call-a-script-from-another-script) – David May 16 '20 at 20:22
    I don't really understand how importing analyzer.py as a module can create nested functions. Nested functions are functions that are defined within another function. Maybe it's worth sharing some actual code here. If you insist on not importing analyzer.py, maybe run it as a separate python process with [subprocess](https://docs.python.org/2/library/subprocess.html). – bartaelterman May 16 '20 at 20:31
  • You can use the `subprocess` module to run another Python script by specifying `sys.executable` as the program to run and pass it the path to the other .py script along with any command-line arguments desired. In the other script the command-line arguments will be in `sys.argv`. – martineau May 16 '20 at 21:19
    Importing modules is normal, it shouldn't be cumbersome. – Peter Wood May 16 '20 at 21:36
  • @David It's similar but doesn't answer my exact question. Added a code example. – thgro May 16 '20 at 23:43
  • @PeterWood and bartaelterman Importing the module itself was easy. But it meant that all of my code within analyzer.py became part of a function. Because of that, the other functions I was defining within analyzer.py became nested. This created issues because there are a lot of variables I want to remain global within analyzer.py. I'm a novice at creating modules and also at variable scopes of nested functions, so perhaps I'm misunderstanding something here. – thgro May 16 '20 at 23:47
    @thgro Were you saying `from analyzer import *`? Because that would import everything. If you just want a single function from the module, import only that, e.g. `from analyzer import the_function`. Have as few global variables as possible (which is probably zero: constants are okay, but better to make them Enums). If a function needs a variable, pass it in as a parameter. If you keep passing the same parameters, group them into a [**`class`**](https://docs.python.org/3/tutorial/classes.html) or a [**`namedtuple`**](https://docs.python.org/3/library/collections.html#collections.namedtuple). – Peter Wood May 17 '20 at 09:27
    Regarding your attempt to use `subprocess`: You can't just convert a list of lists into a string and then expect python to convert it back. You're doing things the hard way, but if you insist, look at [**`ast.literal_eval`**](https://docs.python.org/3/library/ast.html#ast.literal_eval) – Peter Wood May 17 '20 at 09:30
  • @PeterWood This is helpful best practice. In **analyzer.py** I had some functions that were reading a number of variables directly from the overall script rather than taking them as parameters. This wasn't causing any issues until I turned the whole script into a module. But going forward I will instead pass them in as parameters. – thgro May 18 '20 at 00:02
    @thgro it's not just *"best practice"*, it's **normal**. If you're doing something different, and having to jump through all these hoops to make your code reusable, there's something seriously wrong. Just fix your code and call the function you need. – Peter Wood May 18 '20 at 08:04
    @PeterWood That’s exactly what i ended up doing. Thanks. – thgro May 18 '20 at 22:48
  • See this answer: https://stackoverflow.com/a/71967141/1364242 – Jay M Apr 22 '22 at 10:32
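The approach sketched in the comments (run the child with `sys.executable`, serialize the nested list with `repr`, and rebuild it with `ast.literal_eval` in the child) could look like the following sketch. The file name `child_demo.py` is made up, and the child script is written on the fly only so the example is self-contained:

```python
import ast
import subprocess
import sys

# Hypothetical child script: rebuild the nested list from argv and
# print one element so we can see the round trip worked.
child_src = (
    "import ast, sys\n"
    "nested = ast.literal_eval(sys.argv[1])\n"
    "print(nested[1][2])\n"
)
with open('child_demo.py', 'w') as f:
    f.write(child_src)

var4 = [['x', 'y', 'z'], ['x', 'c', 'd']]

# sys.executable runs the same interpreter as the parent; the list form
# needs no shell=True and no manual quoting of arguments.
out = subprocess.check_output(
    [sys.executable, 'child_demo.py', repr(var4)]
).decode().strip()
print(out)
```

Each argument travels as a single `argv` entry, so the nested list survives the trip intact instead of being split on spaces.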

1 Answer


1- exec command:

Python 2:

execfile('test.py')

Python 3:

exec(open('test.py').read())
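With this approach, variables can be handed to the other script through the globals dictionary that `exec` (or `execfile` in Python 2) accepts. A minimal sketch, assuming a made-up file name `worker_demo.py` that is written on the fly just to keep the example runnable:

```python
# Hypothetical target script: it reads var1/var2 as if they were
# its own globals, and defines result.
with open('worker_demo.py', 'w') as f:
    f.write("result = var1 * var2\n")

# Inject variables by passing a dict as the globals namespace
# (Python 3 syntax; Python 2 would use execfile('worker_demo.py', namespace)).
namespace = {'var1': 'ab', 'var2': 3}
exec(open('worker_demo.py').read(), namespace)

# Names assigned by the executed script land back in the same dict.
print(namespace['result'])
```

Note that the target script runs in the same process, so this is closer to importing than to launching a separate program.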

2- os command:

test1.py:

import os 

#os.system('python test2.py')
os.system("python test2.py arg1 arg2")  

test2.py:

import sys

print 'Number of arguments:', len(sys.argv), 'arguments.'
print 'Argument List:', str(sys.argv)
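`os.system` returns the command's exit status, which is worth checking. A self-contained sketch (the file name `echo_args_demo.py` is invented, and the example assumes the interpreter path contains no spaces; otherwise it would need quoting):

```python
import os
import sys

# Hypothetical child script that prints how many arguments it received.
with open('echo_args_demo.py', 'w') as f:
    f.write("import sys\nprint(len(sys.argv) - 1)\n")

# os.system passes the whole string to the shell and returns the
# exit status (0 on success).
status = os.system(sys.executable + ' echo_args_demo.py arg1 arg2')
print(status)
```

Unlike `subprocess`, `os.system` gives no direct way to capture the child's output; it only inherits the parent's console.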

3- subprocess command:

from subprocess import call
call(["python", "test.py"])

To pass arguments, or to run a command through the shell, use subprocess (see the subprocess documentation):

import subprocess

# Simple command. Don't combine a list of arguments with shell=True:
# in that case only the first element would reach the shell, so use
# the plain list form instead.
subprocess.call(['ls', '-1'])

Another example:

file1.py:

import subprocess

# Every argument must be a string ('... ' + 1 would raise a TypeError),
# and the list form avoids shell quoting problems.
args = ['python', 'file2.py', 'id', str(1)]
subprocess.call(args)

file2.py:

import sys

print 'Number of arguments:', len(sys.argv), 'arguments.'
print 'Argument List:', str(sys.argv)

4- socket programming: to share data between two or more Python files, you can use socket programming; see this link.
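A minimal sketch of the socket idea: one endpoint sends a string, the other receives it. Here both ends live in one process (the server runs in a thread) purely so the example is self-contained; in practice each end would be its own script.

```python
import socket
import threading

def serve(srv):
    # Accept one connection, send a payload, and close.
    conn, _ = srv.accept()
    conn.sendall(b'mytxt 100')
    conn.close()

srv = socket.socket()
srv.bind(('127.0.0.1', 0))   # port 0: let the OS pick a free port
srv.listen(1)
t = threading.Thread(target=serve, args=(srv,))
t.start()

cli = socket.socket()
cli.connect(('127.0.0.1', srv.getsockname()[1]))
data = cli.recv(1024).decode()
print(data)

cli.close()
t.join()
srv.close()
```

Unlike the argv-based options, sockets allow the two scripts to keep exchanging data while both are running, at the cost of defining your own message format.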

keramat
Taher Fattahi
  • Thanks taher. I added some code to my question that attempts to use subprocess. Can you show me what I’m doing wrong with that code? – thgro May 17 '20 at 03:44
  • `import sys` in analyzer.py, remove `shell=True` from `subprocess.call(my_lst_str, shell=True)` => `subprocess.call(my_lst_str)`, and then test your code – Taher Fattahi May 17 '20 at 04:17