Context: I'm trying to migrate code from MATLAB to Python. In MATLAB, if you put two files in the same folder, the only thing you need to do to run command lines or functions from the other script is to type the file name. I.e., in file1.m, if I type file2, it automatically runs all lines of code from file2.m, with the variables shared in the same workspace.
I tried to do a similar thing in Python. However, even after reading the posts online, I'm still a little confused.
Related posts:
Run a python script from another python script, passing in args
What is the best way to call a script from another script?
Difference between subprocess.Popen and os.system
Calling a python script with input within a python script using subprocess
So I have the following options, each of which seems to have its own strengths, but the posts didn't make a clear comparison between all of them (I've sketched how I picture the import option right after the list):
import
os.system
os.popen()
subprocess.Popen()
subprocess.call()
execfile
or simply

exec(open(dir + '\\a.py').read())   # assuming a.py defines x = 1
print(x)                            # outputs 1
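For comparison, this is my (possibly wrong) picture of the import option, with made-up contents for file2.py that only define one variable:

# file2.py (made-up contents)
x = 1

# file1.py
import file2            # runs file2.py once, in its own namespace
print(file2.x)          # outputs 1 -- the variable lives on the module object, not in file1's namespace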
The way I need it to work is for file2.py to use the same variables as file1.py, and for the variables generated by file2.py to be readable by file1.py, just like running a sequence of commands in one file, or running everything in the same workspace in MATLAB.
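To make the goal concrete, this is the behaviour I'm after (the contents of file2.py here are made up; as far as I can tell, exec() run at the top level of file1.py is what comes closest to MATLAB's shared workspace):

# file2.py (made-up contents)
b = a + 1                        # uses a variable created by file1.py

# file1.py
a = 10
exec(open('file2.py').read())    # runs file2.py's lines in file1.py's own namespace
print(b)                         # outputs 11 -- b was created by file2.py but is visible here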
Some people mentioned that execfile cannot pass arguments, and that the best way is therefore to use os.system; other people mentioned that subprocess is better than os.system, but I'm not sure whether using subprocess affects the sharing of variables differently. Also, some people used subprocess.Popen() and some used subprocess.call(). From what I've seen, exec(open(dir + '\\a.py').read()) seems to be the closest to the process I have in mind.
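If I understand the other posts correctly, os.system and subprocess start a completely separate Python process, so no variables are shared at all and anything file2.py produces only comes back as text, roughly like this:

# file1.py
import subprocess

a = 10                                            # file2.py cannot see this variable
result = subprocess.run(['python', 'file2.py'],   # or 'python3', depending on the setup
                        capture_output=True, text=True)
print(result.stdout)                              # only file2.py's printed output comes back, as a string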
Which one should I be using, and why is it better?
Can you make a "catalog" or table basically clarifying which one I should use in what type of circumstance?