
I have a script_A that handles different inputs using argparse and performs a function. I now need a script_B that runs all of script_A (with particular inputs) from inside itself. I am using Windows.

Goal: Script_B is going to do further analysis based on the output of script_A. Script_A's behavior changes according to the argument options passed. Script_B's behavior is always the same. I would rather not combine script_A and script_B into a massive script.

Goal Updated: For script_B to work, I need to run script_A and then pass one of the dictionaries it computes, dictionary D (output from A), on to script_B. The dictionary is only available once all of script_A has run.
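Schematically, what I'm after is something like this (`run_script_A` and `further_analysis` are just hypothetical stand-ins to show the data flow, not real functions):

```python
# Illustrative sketch only: these names are made up to show the data flow.
def run_script_A(id, re):
    # would somehow run ALL of script_A with these arguments
    # and hand back the dictionary D it computes at the end
    return {'id': id, 're': re}   # stand-in for the real dictionary D

def further_analysis(D):
    # script_B's own work, always the same, driven by A's output
    return len(D)

D = run_script_A('A', 'C')
print(further_analysis(D))
```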

This is what Script_A looks like

import sys
import os
from argparse import ArgumentParser

def function_1():
    # it does stuff...
    pass

def function_2():
    # it does other stuff...
    pass

if __name__ == "__main__":
    parser = ArgumentParser(description="functionA.py -i [--print]")
    parser.add_argument('-i', '--id', help="Please write A or B", required=True)
    parser.add_argument('-r', '--re', help="Please write C or D", required=True)

    sysargs = parser.parse_args()

    #Variable definitions

    if str(sysargs.id) == "A":
        # Uses file A located in directory X to perform analysis
        function_1()
    elif str(sysargs.id) == "B":
        # Uses file B located in directory Y to perform analysis
        function_2()

    if str(sysargs.re) == "C":
        # Provides thorough printout of analysis (more in depth for debugging)
        pass
    if str(sysargs.re) == "D":
        # Does nothing (no debugging option)
        pass

Script A runs fine; it does its job when I run it from the command line, passing the required (and sometimes optional) arguments.

For script B, I've tried the following:

1

import sys
import numpy as np
import os
import script_A

os.system("script_A.py", -i "A" -r "C")

#Other stuff that script B does

2

import sys
import os
import script_A

exec(script_A.py[-i "A" -r "C"])

#Other stuff that script B does

3

import os
import sys
from subprocess import call

subprocess.call("script_A.py", -i "A" -r "C")

#Other stuff that script B does

I've looked here: Calling an external command in Python

and here: importing a python script from another script and running it with arguments

but have not been able to figure it out from what they are saying. Any help is greatly appreciated. I am still a beginner to Python.

I have tried the following based on comments:

1

import subprocess
import script_A

p.subprocess.Popen("script_A.py", "-i", "A", "-r", "none", shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)


(stdoutput, erroutput) = p.communicate()

TypeError: __init__() got multiple values for keyword argument 'stdout'

I tried adding the self argument but I get the following error

p.subprocess.Popen("script_A.py", "-i", "A", "-r", "C", shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

TypeError: __init__() got multiple values for keyword argument 'stderr'

2

import subprocess
import script_A

process= Popen(["script_A", "-i", "A", "-r", "C"], stdout = subprocess.PIPE, stderr=subprocess.STDOUT)

output = process.communicate()

OSError: [Errno 2] No such file or directory

The traceback ends inside subprocess.py, in _execute_child, at

raise child_exception

I don't know what this means.

    Why do you need to pass the arguments via the command line? Just `import` and call the functions *directly*. – jonrsharpe May 31 '16 at 23:06
  • Sorry, could you elaborate a bit more on that? Just to clarify I don't need to run separate functions from script A in script B. I need to run the whole thing. I also would like to flexibility of parsing arguments for when I want to show more output vs when I don't. – python_rookie May 31 '16 at 23:46
  • So what do those alternatives do (or not do)? What are you expecting to happen? You don't need `import script_A` if you are not directly using its functions (as in the `os.system` version). – hpaulj Jun 01 '16 at 00:11
    On your use of `argparse`, you don't need `str(sysargs.id)`. `args=parser.parse_args()` and `if args.id == 'A':...` should be enough. – hpaulj Jun 01 '16 at 00:13
  • Oops, sorry hpaulj I actually do have the args = parser.parse_args() in script A. Let me add it now. It still doesn't work with that line. I'll elaborate on the alternatives, in my next edit. Thank you – python_rookie Jun 01 '16 at 00:21
  • Do the functions not take a verbosity flag? Are both scripts yours? It would be much easier if the one you're calling had a sensible entry point so you could call it (and *"run the whole thing"*) without going back outside Python. Then the `__name__` conditional *only* parses args and uses them to call that one entry point. – jonrsharpe Jun 01 '16 at 06:21
  • Yes, both scripts are mine. I can add the verbose flag. I updated what I wanted to do. I want to pass a dictionary that is made whenever script A runs. Would you mind elaborating on how I could do that with verbose? – python_rookie Jun 01 '16 at 20:56

3 Answers

p = subprocess.Popen("your_a_code_with_args_here",
                     shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

(stdoutput, erroutput) = p.communicate()

Then your B code could do things according to the output of A code.
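Since a pipe only carries text, one way (sketched here, with a stand-in child process in place of your real script_A.py) is to have A print its dictionary as JSON on its last line, and have B parse it back; `child_code` below is only an illustration of what script_A's `__main__` block would end with:

```python
import json
import subprocess
import sys

# Stand-in for script_A.py: in the real script, the __main__ block
# would end with `print(json.dumps(D))` after D is computed.
child_code = "import json; D = {'id': 'A', 're': 'C'}; print(json.dumps(D))"

p = subprocess.Popen([sys.executable, "-c", child_code],
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdoutput, _ = p.communicate()

# Parse the JSON line back into a real dictionary.
D = json.loads(stdoutput.decode())
print(D)
```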

atline
  • Thanks a lot for the suggestion. I tried it and it is not working. I am having some troubles. I have updated my problem description to reflect what I am seeing now. I want to pass a dictionary that is made after the whole script is run. – python_rookie Jun 01 '16 at 20:58
  • I don't know a way to exchange a Python object over a pipe directly. You can't just return a Python object; use json.dumps to encode your object and print it, while your B code uses json.loads to decode the JSON string and recover the real Python object from code A. – atline Jun 02 '16 at 12:25

I've condensed your A to:

import argparse

def function_1():
    print('function 1')

def function_2():
    print('function 2')

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument('-i', '--id', choices=['A','B'], default='A')
    parser.add_argument('-r', '--re', choices=['C','D'], default='C')

    args = parser.parse_args()

    #Variable definitions

    if args.id == "A":
        print("A")
        function_1()
    elif args.id == "B":
        print('B')
        function_2()
    if args.re == "C":
        print('C')
    if args.re == "D":
        print('D')

If B is

import os
os.system("python3 stack37557027_A.py -i A -r C")

then A is run and displays

A
function 1
C

With the main block, all import stack37557027_A does is import the 2 functions, which could then be run from B with

stack37557027_A.function_1()

If that is unclear, reread documentation about the purpose of if __name__ == "__main__": and the difference between running a script and importing a module. That is important basic Python.

With @atline's suggestion:

import subprocess

p = subprocess.Popen("python3 stack37557027_A.py -i A -r C",
                     shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

(stdoutput, erroutput) = p.communicate()
print(stdoutput)
#print(erroutput)

The display is:

b'A\nfunction 1\nC\n'
#None

Same thing, but wrapped as bytestring.

Note that in both cases I am invoking the script with one string, that includes the file name as well as the arguments.

p = subprocess.Popen(["python3", "stack37557027_A.py", "-i", "A", "-r", "C"],
                 stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

This also works. shell is no longer True, and I am using a list of strings.

You may need to review the docs for subprocess. https://pymotw.com/2/subprocess/

=====================

Your updated goal is:

I need to run script_A and then pass one of the dictionaries, dictionary D, that is calculated (output from A) on to B.

A problem with this is that the methods used so far either just display a string to stdout, or return a string to the caller (from stdout). They do not return a Python object like a dictionary.

Where in your A is this dictionary being produced? How does it depend on the argparse options?

Let's change A so that a function in the body produces a dictionary:

def function_1(id, re):
    print('function 1')
    return {'id':id, 're':re}

and call it with (in the if __name__ block):

args = parser.parse_args()
print(function_1(args.id, args.re))

The result is something like

1659:~/mypy$ python3 stack37557027_A.py 
function 1
{'re': 'C', 'id': 'A'}

Script B can produce a string display of this dictionary as well

1659:~/mypy$ python3 stack37557027_B.py
b"function 1\n{'re': 'C', 'id': 'A'}\n"

Now, if in B I do

import stack37557027_A as A
print(A.function_1('AA','BB'))

I get

{'id': 'AA', 're': 'BB'}

So if all the action is in the body of A, and the if __name__ block just does the argparse parsing and delegation, I can use functions from A via import and get back a dictionary (or whatever object those functions produce). In other words, I don't need to invoke A as a subprocess or system call.
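As a sketch, the restructured A would put everything behind a single entry point that returns the dictionary (names here are illustrative placeholders for your real functions):

```python
import argparse

def function_1(id, re):
    # stand-in for the real analysis
    return {'id': id, 're': re}

def main(id, re):
    # everything script_A does, ending with the dictionary it computes
    return function_1(id, re)

if __name__ == "__main__":
    # only parse arguments and delegate to main()
    parser = argparse.ArgumentParser()
    parser.add_argument('-i', '--id', choices=['A', 'B'], default='A')
    parser.add_argument('-r', '--re', choices=['C', 'D'], default='C')
    args = parser.parse_args()
    print(main(args.id, args.re))
```

Script B then just does `import stack37557027_A as A` followed by `D = A.main('A', 'C')` and gets a real dictionary back, no subprocess needed.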

If you really do need to run A as a separate process and still get back Python objects, I think you can use multiprocessing.

https://docs.python.org/2/library/multiprocessing.html#exchanging-objects-between-processes

https://pymotw.com/2/multiprocessing/basics.html
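A minimal sketch of that idea, with a stand-in worker in place of A's real computation: the child sends its dictionary back over a multiprocessing.Pipe, which pickles real Python objects rather than passing text.

```python
import multiprocessing

def run_a(conn, id, re):
    # stand-in for running all of script_A; send back its dictionary D
    D = {'id': id, 're': re}
    conn.send(D)    # the dict is pickled through the pipe
    conn.close()

if __name__ == "__main__":
    parent_conn, child_conn = multiprocessing.Pipe()
    p = multiprocessing.Process(target=run_a, args=(child_conn, 'A', 'C'))
    p.start()
    D = parent_conn.recv()   # a real dict, not a string dump of one
    p.join()
    print(D)
```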

hpaulj

Version 3 is almost correct, although there are so many missing ' and " characters in your examples that I'm not sure the code you posted is what actually runs on your side...

from subprocess import call

# you need to pass the name plus all the arguments as a list here
retcode = call(["script_A.py", "-i", "A", "-r", "C"])

#Other stuff that script B does

Prefer Popen over call if you want to capture the output:

from subprocess import Popen, PIPE, STDOUT

# you need to pass the name plus all the arguments as a list here
process = Popen(["script_A.py", "-i", "A", "-r", "C"], stdout=PIPE, stderr=STDOUT)

output = process.communicate()  # let it run until finished

print(output[0])  # stdout
print(output[1])  # stderr (None here, since stderr is merged into stdout)
#Other stuff that script B does
RedX
  • I am having some trouble implementing this. I have updated what I've done to show what I am getting. Thank you for the help, much appreciated. – python_rookie Jun 01 '16 at 20:57
  • @python_rookie Your **1** is not passing the arguments as a list. Number **2** is trying to call `script_A` instead of `script_A.py`. Please pay more attention. Make sure `script_A.py` is within your path else you have to pass the full path to it. – RedX Jun 02 '16 at 05:05