
I have a pipeline composed of a number of scripts. Almost all of the scripts read and write files, and some of the scripts in the pipeline have conflicting file formats: they will read, and attempt to act on, files they should not, because all of the files share the same extension and live in the same directory. The scripts are executed from one wrapper script that calls them in the appropriate sequence using subprocess.Popen, e.g.: subprocess.Popen("python file_processing.py " + i + " " + deliminator, shell=True).wait(). What I need to do is have all the files output by these subprocesses sent to a particular directory (a subdirectory of the current directory). I've tried playing around with stdout, but no luck yet.
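For context, a minimal sketch of the wrapper pattern described above, using an argument list instead of a shell string so that filenames with spaces don't need quoting (the input names and delimiter value are hypothetical, and an inline child stands in for file_processing.py):

```python
import subprocess
import sys

delimiter = ","                          # hypothetical delimiter value
for i in ["input 1.txt", "input2.txt"]:  # hypothetical input files
    # Stand-in for "python file_processing.py <file> <delimiter>";
    # this child just logs the filename it was given.
    p = subprocess.Popen(
        [sys.executable, "-c",
         "import sys; open('wrapper.log', 'a').write(sys.argv[1] + '\\n')",
         i, delimiter])
    p.wait()

print(open("wrapper.log").read())
```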

asked by Boss1295 (edited by Cœur)

1 Answer


If you can't change the output filenames (e.g., by passing them as parameters to the subprocess, or by specifying the output directory as a parameter), then try running the subprocesses in a different directory:

from subprocess import check_call

check_call(args, cwd="subdir")

Make sure that args uses absolute paths so that the input files can still be found.
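Here's a minimal, runnable sketch of the idea (an inline child script stands in for one of your pipeline steps; the directory and file names are hypothetical):

```python
import os
import sys
from subprocess import check_call

# Run the step inside the output subdirectory, so any file the child
# creates with a relative name lands in "subdir".
out_dir = "subdir"
os.makedirs(out_dir, exist_ok=True)

# The input path must be absolute: the child's working directory is
# out_dir, so a relative path would resolve against "subdir" instead.
input_file = os.path.abspath("data.txt")
open(input_file, "w").write("hello\n")

# Stand-in for one pipeline step: read the input given on argv and
# write the output under a relative name (which ends up in out_dir).
child = ("import sys; data = open(sys.argv[1]).read(); "
         "open('out.txt', 'w').write(data.upper())")
check_call([sys.executable, "-c", child, input_file], cwd=out_dir)

print(open(os.path.join(out_dir, "out.txt")).read())  # prints HELLO
```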

A better alternative is to import the modules and call specific functions with the necessary parameters instead, so each step is told explicitly where to read and write.
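For example (the function and file names here are hypothetical), each pipeline step could expose a function that takes explicit input and output paths, and the wrapper would call it directly instead of spawning a subprocess:

```python
import os

# Hypothetical refactor of one pipeline step: instead of guessing its
# inputs from the working directory, the step receives explicit paths.
def process_file(in_path, out_dir, delimiter=","):
    os.makedirs(out_dir, exist_ok=True)
    out_path = os.path.join(out_dir, os.path.basename(in_path) + ".out")
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            dst.write(delimiter.join(line.split()) + "\n")
    return out_path

# The wrapper calls the function directly; output goes where you say.
open("sample.txt", "w").write("a b c\n")
result = process_file("sample.txt", "processed")
print(open(result).read())  # prints a,b,c
```

This also sidesteps the format-conflict problem: a step only ever touches the files it is handed, regardless of extensions or directory contents.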

answered by jfs (edited by Community)