I have a pipeline composed of a number of scripts. Almost all of the scripts read and write files, and some of the scripts in the pipeline have conflicting file formats (they will read, and attempt to act on, files they should not, because all the files share the same extension and sit in the same directory). These scripts are executed from one wrapper script that calls them all in the appropriate sequence using subprocess.Popen. E.g.:
subprocess.Popen("python file_processing.py " + i + " " + deliminator, shell=True).wait()
What I need to do is have all the files output by this Popen call sent to a particular directory (a subdirectory of the current directory). I've tried playing around with stdout, but no luck yet.
1 Answer
If you can't change the output filenames, e.g., by passing them as a parameter to the subprocess or by specifying the output directory as a parameter, then try running the subprocesses in a different directory:
from subprocess import check_call
check_call(args, cwd="subdir")
Make sure that the paths in args are absolute so that the input files can be found.
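For example, a minimal sketch of one wrapper-script step under this approach; the input filename and delimiter are placeholders, not from the question:

import os
from subprocess import check_call

outdir = "subdir"                    # destination directory for this step's output
os.makedirs(outdir, exist_ok=True)   # create it if it doesn't already exist

script = os.path.abspath("file_processing.py")  # script from the question
infile = os.path.abspath("input.txt")           # placeholder input file
delimiter = ","                                 # placeholder delimiter

# The child process runs inside outdir, so any files it creates with
# relative names land there; inputs are passed as absolute paths.
check_call(["python", script, infile, delimiter], cwd=outdir)

Passing the arguments as a list also avoids the quoting pitfalls of building a shell=True command by string concatenation.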
A better alternative is to import the modules and call specific functions with the necessary parameters instead.
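For instance, assuming file_processing.py exposes a process(path, delimiter, outdir) function (a hypothetical name and signature, not from the question):

import file_processing

# Calling the function directly lets you pass the output directory
# explicitly, so each step's files land where you want without
# changing the working directory or spawning a subprocess.
file_processing.process("input.txt", ",", outdir="subdir")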
Thank you, this is just what I was looking for. – Boss1295 Jun 09 '15 at 23:24