
I'm creating a Python script that runs rsync using subprocess and then gets the stdout and prints it. The script runs multiple rsync processes based on a conf file using this code:

for share in shares.split(', '):
    username = parser.get(share, 'username')
    sharename = parser.get(share, 'name')
    local = parser.get(share, 'local')
    remote = parser.get(share, 'remote')
    domain = parser.get(share, 'domain')
    remotedir = username+"@"+domain+":"+remote
    rsynclog = home + "/.bareshare/"+share+"rsync.log"
    os.system("cp "+rsynclog+" "+rsynclog+".1 && rm "+rsynclog)  # Move and remove old log
    # Run rsync for each share
    self.rsyncRun = subprocess.Popen(
        ["rsync", "--bwlimit="+upload, "--stats", "--progress", "-azvv",
         "-e", "ssh", local, remotedir, "--log-file="+rsynclog],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE)

I think that running multiple syncs at the same time might not be the best approach. How could I set this up so that I wait for one process to finish before the next one starts?

You can find my complete script here: https://github.com/danielholm/BareShare/blob/master/bareshare.py

Edit: And how do I make self.rsyncRun die when done? When rsync is done with all the files, it seems to keep running, although it shouldn't be doing that.

Daniel Holm

1 Answer


Calling

self.rsyncRun.communicate()

will block the main process until the rsyncRun process has finished.
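Because `communicate()` blocks until the child exits, placing it inside the loop makes the loop itself the sequencing mechanism: each rsync finishes before the next one starts. A minimal sketch, using placeholder `echo` commands standing in for the real per-share rsync invocations:

```python
import subprocess

# Placeholder commands standing in for the per-share rsync invocations
commands = [
    ["echo", "share one done"],
    ["echo", "share two done"],
]

for cmd in commands:
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = proc.communicate()  # blocks until this process finishes
    print(out)                     # the next command only starts after this returns
```

`communicate()` also reads the child's stdout and stderr to EOF and reaps the process, so the child is fully cleaned up before the loop continues.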


If you do not want the main process to block, then spawn a thread to handle the calls to subprocess.Popen:

import threading

def worker():
    for share in shares.split(', '):
        ...
        rsyncRun = subprocess.Popen(...)
        # Wait for this rsync to finish before starting the next one
        out, err = rsyncRun.communicate()

t = threading.Thread(target = worker)
t.daemon = True  # let the thread die when the main process exits
t.start()
t.join()         # optional: only if you want to block until all rsyncs finish
unutbu
  • Sorry, I don't want the main process to wait, just the rsync processes to wait for each other. – Daniel Holm Jan 26 '12 at 20:01
  • You could use the threading module to spawn a thread which handles the calls to `subprocess.Popen`. Then you could call `self.rsyncRun.communicate()` in the thread without blocking the main process. – unutbu Jan 26 '12 at 20:03
  • So, I've added this function, but now I don't know how to get the output from the command anymore? – Daniel Holm Jan 30 '12 at 13:52
    If you are willing to wait until `rsyncRun` is done, then you can simply access `out` and `err` after `t.join()`. But if you wish to see the output written to stdout and stderr by the `rsync` command as it is being produced, then you may need to use `select.select` (or something similar). See this [SO answer](http://stackoverflow.com/a/7730201/190597) for an example of how to do this. – unutbu Jan 30 '12 at 18:43
  • Thanks, I'll have a look. EDIT: Just one more question: How do I kill the worker? It continues although I send "gtk.main_quit()". I have to both click on exit and press Ctrl+C to terminate the script. – Daniel Holm Jan 31 '12 at 12:30
    Adding `t.daemon = True` before `t.start()` will cause the thread to terminate when the main process ends. – unutbu Jan 31 '12 at 13:30
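For the streaming-output question raised in the comments, here is a minimal sketch of the `select.select` approach mentioned above: it reads a child's stdout and stderr line by line as they are produced, rather than waiting for `communicate()` to return everything at the end. The `sh -c` command is a placeholder for the real rsync invocation, and `select` on pipe file objects works this way on POSIX systems only:

```python
import select
import subprocess

# Placeholder child process that writes to both stdout and stderr
proc = subprocess.Popen(
    ["sh", "-c", "echo out-line; echo err-line >&2"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

streams = [proc.stdout, proc.stderr]
captured = []
while streams:
    # Block until at least one of the pipes has data (or hits EOF)
    readable, _, _ = select.select(streams, [], [])
    for stream in readable:
        line = stream.readline()
        if not line:                 # EOF: the child closed this pipe
            streams.remove(stream)
            continue
        captured.append(line)        # here you could print or log the line
proc.wait()                          # reap the child once both pipes are drained
```

Reading both pipes this way also avoids the deadlock that can occur if one pipe's buffer fills while you are blocked reading the other.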