I am trying to run consecutive jobs with the help of Python. Below is the detailed problem.
The parent directory contains ~1500 subdirectories, each holding two input files, so in total I have to run about 3000 separate calculations. This is the first time I am trying to automate running the jobs one after another with Python. For the test run, I am just trying with two input files (1.in and 2.in) placed in a directory called rough (a rough sketch of how I expect to extend this to all the subdirectories is included after the test script). The script I am using for this is:
import os, glob

path = '/home/prasantb/rough/'
inputs = []
for files in glob.glob('*.in'):
    inputs.append(files)
print(inputs)

import subprocess
for i in inputs:
    subprocess.call(['psi4', i], shell=True)
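Eventually I will have to collect the .in files from every subdirectory of the parent directory mentioned above; a rough sketch of what I have in mind for that part is below (the parent path is only a placeholder, not my real directory):

import os, glob

# Placeholder for the real parent directory holding the ~1500 subdirectories.
parent = '/home/prasantb/parent/'

# One '*' per directory level: this picks up every .in file that sits
# directly inside a subdirectory of parent.
all_inputs = sorted(glob.glob(os.path.join(parent, '*', '*.in')))
print(len(all_inputs), 'input files found')

For now, though, I am only trying to get the two-file test above to work.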
To run a psi4 job, I just have to use psi4 abc.in, which automatically writes abc.out and related files in the same directory. The output I am getting is:
Traceback (most recent call last):
  File "/home/prasantb/psi4conda/bin/psi4", line 217, in <module>
    raise KeyError("The file %s does not exist." % args["input"])
KeyError: 'The file input.dat does not exist.'
Traceback (most recent call last):
  File "/home/prasantb/psi4conda/bin/psi4", line 217, in <module>
    raise KeyError("The file %s does not exist." % args["input"])
KeyError: 'The file input.dat does not exist.'
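The error mentions input.dat, which I never created (my files are named 1.in and 2.in), so psi4 does not seem to receive the filename I pass from the script. The command that works when I type it in the shell is psi4 1.in; my best guess at a direct translation of that call is below, although I have not verified it and I am unsure about the shell=True part in my script above:

import glob, subprocess

for infile in sorted(glob.glob('*.in')):
    # Equivalent of typing "psi4 1.in": the list form without shell=True
    # hands the filename to psi4 as its command-line argument.
    subprocess.call(['psi4', infile])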
Could anyone kindly help? Thank you in advance.