I have an embedded system (Linux/Debian) on which I run several Python scripts, each in its own console. The scripts are web servers, so they communicate with each other and are not supposed to finish; that is why each one gets its own console.
I connect directly to this machine over SSH and start the scripts in different consoles. Now I want to run those same scripts on several machines at the same time. I thought I could have a `.sh` on each system and call it via SSH, although a `.py` file would also work. Something like `init_all_processes.sh`:
```bash
#!/bin/bash
sudo python3 /home/proj1/webServer.py
sudo python3 /home/proj2/server/main.py
sudo python3 /home/proj1/test/test_server.py
sudo python3 /home/proj2/process_receiving_images.py
```
But when I run `init_all_processes.sh` on each remote machine from my host/local one with

```bash
ssh root@MachineB 'bash -s' < init_all_processes.sh
```

(it does not matter that the script lives locally, since all the relevant code/repos are on those machines as well), the first web server blocks everything: it is not a "run and finish" script but keeps running until killed, so the remaining scripts are never started. I have seen that `screen` can work.
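For example, I imagine the launcher could become something like the sketch below, starting each script in its own detached session (the session names such as `proj1_web` are placeholders I made up), though I do not know if this is the cleanest approach:

```bash
#!/bin/bash
# Sketch: start each script in its own detached screen session so the
# launcher returns immediately instead of blocking on the first server.
sudo screen -dmS proj1_web    python3 /home/proj1/webServer.py
sudo screen -dmS proj2_main   python3 /home/proj2/server/main.py
sudo screen -dmS proj1_test   python3 /home/proj1/test/test_server.py
sudo screen -dmS proj2_images python3 /home/proj2/process_receiving_images.py
```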
Is there a direct way to run different scripts as if each were started in its own console/process? I was first thinking of a master (host) / slaves (remote servers) structure with IPs, over SSH. That way I could even pass parameters to each script (for example `num_epochs`, not important right now) to get different results; see the sketch after this paragraph. But the problem remains that each script never finishes (they are not meant to), so the following scripts never get run. I do not need a log, if that is an issue.
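To make the idea concrete, this is roughly the master script I had in mind on the host. The machine list and the way the parameter is forwarded are just assumptions for illustration:

```bash
#!/bin/bash
# Sketch of the host side: launch init_all_processes.sh on every remote
# machine in parallel. MACHINES and NUM_EPOCHS are made-up example names.
MACHINES=("MachineB" "MachineC")
NUM_EPOCHS=10

for m in "${MACHINES[@]}"; do
    # The trailing & keeps the loop from waiting on machines whose scripts
    # never finish; 'bash -s' makes NUM_EPOCHS available as $1 in the script.
    ssh "root@$m" 'bash -s' -- "$NUM_EPOCHS" < init_all_processes.sh &
done
wait  # only needed if the master should stay attached to the remote runs
```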