
I am looking for a way to run a Python script on a set of remote hosts in parallel. The script has the same name on each host, and I need it to run on every host at the same time. Can I use multiprocessing.Pool to accomplish this? I am thinking about something like the following:

#!/usr/bin/env python3

import csv
import os
from multiprocessing import Pool
from os import path

remoteScript = '/app/DR_Autoomation/DR_start_processes.py'
start1 = 'hostIP1:remoteScript'
start2 = 'hostIP2:remoteScript'
start3 = 'hostIP3:remoteScript'

hosts = (start1, start2, start3)

def start_host_processes(node_process_start):
    os.system('python {}'.format(node_process_start))

start_pool = Pool(processes=3)
start_pool.map(start_host_processes, hosts)

I am not sure how to "attach" the remoteScript variable to the hostIP variables, but the above script is a general outline of what I need to accomplish. The reason for running remoteScript on all hosts in parallel is that I need to meet a one-hour SLA to have the environment up and running.
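For illustration, here is a rough sketch of one way this could be wired together, assuming each host accepts key-based SSH from the machine running the script; the host names and remote invocation are placeholders, not a tested solution:

#!/usr/bin/env python3

# Rough sketch: launch the same remote script on every host over SSH in parallel.
# Assumes key-based SSH access; host names and the remote python command are placeholders.
import subprocess
from multiprocessing import Pool

remoteScript = '/app/DR_Autoomation/DR_start_processes.py'
hosts = ('hostIP1', 'hostIP2', 'hostIP3')

def start_host_processes(host):
    # "attach" the script path to the host by building one ssh command per host
    return subprocess.run(
        ['ssh', host, 'python3 {}'.format(remoteScript)],
        capture_output=True, text=True
    )

if __name__ == '__main__':
    with Pool(processes=len(hosts)) as start_pool:
        results = start_pool.map(start_host_processes, hosts)
    for host, result in zip(hosts, results):
        print(host, result.returncode)

Each pool worker just blocks on its own ssh call, so the remote scripts start at roughly the same time.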

ajhowey
  • Python's `multiprocessing` module is for a single host. But there are lots of ways to run distributed commands, though they can be OS-specific. `pdsh` from a Linux command line comes to mind. I noticed https://stackoverflow.com/questions/26876898/python-multiprocessing-with-distributed-cluster has some solutions. This is more a question for an internet search engine than stackoverflow. – tdelaney May 04 '22 at 20:46
  • If the target is a system supporting SSH, you could use a thread pool to run paramiko connections to the remote hosts. – tdelaney May 04 '22 at 20:53
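For instance, a minimal sketch of that paramiko approach, assuming paramiko is installed and key-based SSH auth is configured; the hosts, username, and script path are placeholders:

#!/usr/bin/env python3

# Minimal sketch: run the same script on each host with paramiko over a thread pool.
# Assumes key-based SSH auth; hosts, username, and script path are placeholders.
from concurrent.futures import ThreadPoolExecutor

import paramiko

remoteScript = '/app/DR_Autoomation/DR_start_processes.py'
hosts = ('hostIP1', 'hostIP2', 'hostIP3')

def run_remote(host):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username='appuser')  # placeholder user
    try:
        _, stdout, stderr = client.exec_command('python3 {}'.format(remoteScript))
        return stdout.channel.recv_exit_status(), stderr.read().decode()
    finally:
        client.close()

if __name__ == '__main__':
    with ThreadPoolExecutor(max_workers=len(hosts)) as pool:
        for host, (status, err) in zip(hosts, pool.map(run_remote, hosts)):
            print(host, status, err)

Threads are sufficient here because each task spends its time waiting on the network rather than on the CPU.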

0 Answers