I have a RabbitMQ message queue to which I add configuration.json messages. Each of these configurations has all the information required to start a workflow. So if I have 5 messages in the queue, my consumer needs to issue 5 commands, one per configuration, each of which starts a workflow.
If I exit the Python script that spawns these processes, the actual workflow processes are killed as well. And that is the issue.
Currently:
import subprocess

subprocess.run(f'nextflow run nextflow/workflow_runner.nf -w "gs://nextflow-text-bucket/work" -c nextflow/nextflow.config --at_config {config_file} --csv {csv_dest} &', shell=True)
Here I simply append an & to the end of the command so that it runs in the background. How can I alter this code so that my Python script creates a new process which is disconnected from this consumer?
Basically: how can I efficiently create n processes from a Python script that are not dependent on the script that started them, while keeping the ability to log on to the machine and connect to the output of each process?
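
For context, this is roughly what I have in mind, as a minimal untested sketch. The logs/ directory, the launch_detached name, and the use of start_new_session are my assumptions, not working code:

import subprocess
from pathlib import Path

def launch_detached(config_file: str, csv_dest: str, log_dir: str = "logs") -> int:
    """Start one workflow that should survive this script's exit; return its PID."""
    Path(log_dir).mkdir(exist_ok=True)
    log_path = Path(log_dir) / f"{Path(config_file).stem}.log"
    cmd = [
        "nextflow", "run", "nextflow/workflow_runner.nf",
        "-w", "gs://nextflow-text-bucket/work",
        "-c", "nextflow/nextflow.config",
        "--at_config", config_file,
        "--csv", csv_dest,
    ]
    with open(log_path, "ab") as log:
        proc = subprocess.Popen(
            cmd,
            stdout=log,               # workflow output goes to a per-config log file
            stderr=subprocess.STDOUT,
            stdin=subprocess.DEVNULL,
            start_new_session=True,   # new session: not killed when the parent exits
        )
    return proc.pid

Each consumed message would call launch_detached(...) once, and the returned PID could be recorded in case I later need to check on or stop a workflow.
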
This is what the output looks like:
N E X T F L O W ~ version 20.01.0
Launching `nextflow/workflow_runner.nf` [pedantic_pauling] - revision: a8fe3d5045
executor > google-lifesciences (1)
[92/828e5d] process > prc1 [ 0%] 0 of 1
[- ] process > prc2 -
[- ] process > prc3 -
[- ] process > prc4 -
[- ] process > prc5 -
This display is updated as each process completes. So I want to be able to start this off in the background, but still "attach" to it later and check where it's reached.
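
If it helps, by "attach" I mean something as simple as following the per-workflow log file written by the sketch above, either with tail -f or a small Python loop (the log path here is hypothetical):

import time

def follow(path: str):
    """Yield lines as they are appended to the log, like `tail -f`."""
    with open(path) as f:
        f.seek(0, 2)  # start at the end of the file
        while True:
            line = f.readline()
            if line:
                yield line
            else:
                time.sleep(0.5)  # wait for new output

# Attach to one workflow's output (hypothetical log path from the sketch above)
for line in follow("logs/configuration.log"):
    print(line, end="")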