`Popen(cmd, stdout=PIPE, stderr=PIPE)` won't "lock" your parent process, but `cmd` itself may stall if it produces enough output to fill the OS pipe buffers while nothing reads from them. If you want to discard the subprocess' output, use `DEVNULL` instead of `PIPE`:
import os
from subprocess import Popen, STDOUT
DEVNULL = open(os.devnull, 'wb')  # NOTE: it is already defined in Python 3.3+
p = Popen(cmd, stdout=DEVNULL, stderr=STDOUT)
# ...
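On Python 3.3+, you can skip opening `os.devnull` manually and use the stdlib constant directly (same idea, assuming the same `cmd` as above):

from subprocess import Popen, DEVNULL, STDOUT

p = Popen(cmd, stdout=DEVNULL, stderr=STDOUT)  # discard both streams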
If you want to process the output without blocking the main thread, there are several approaches: `fcntl`, `select`, named pipes with IOCP (on Windows), and threads. The last one is the most portable:
p = Popen(cmd, stdout=PIPE, stderr=PIPE, bufsize=-1)
bind(p.stdout, stdout_callback)
bind(p.stderr, stderr_callback)
# ...
where the `bind()` function is:
from contextlib import closing
from functools import partial
from threading import Thread

def bind(pipe, callback, chunksize=8192):
    def consume():
        with closing(pipe):
            # read fixed-size chunks until EOF (b'') and hand them to the callback
            for chunk in iter(partial(pipe.read, chunksize), b''):
                callback(chunk)
    t = Thread(target=consume)
    t.daemon = True
    t.start()
    return t
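For example, to collect stdout while discarding stderr, you could wire it up like this (a sketch; `cmd` is a placeholder and the callbacks are just illustrations):

from subprocess import Popen, PIPE

chunks = []
p = Popen(cmd, stdout=PIPE, stderr=PIPE, bufsize=-1)
t_out = bind(p.stdout, chunks.append)       # accumulate stdout chunks
t_err = bind(p.stderr, lambda data: None)   # drain and ignore stderr
p.wait()                                    # wait for the subprocess to exit
t_out.join()                                # make sure the pipes are fully drained
t_err.join()
output = b''.join(chunks)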
You don't need an external process to copy a file in Python without blocking the main thread:
import shutil
from threading import Thread
Thread(target=shutil.copy, args=['source-file', 'destination']).start()
Python can release the GIL during I/O, so the copying happens both concurrently and in parallel with the main thread.
You could compare it with a script that uses multiple processes:
import shutil
from multiprocessing import Process
Process(target=shutil.copy, args=['source-file', 'destination']).start()
If you want to cancel the copying when your program dies, set the `thread_or_process.daemon` attribute to `True` before calling `start()`.
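On Python 3.3+, both `Thread()` and `Process()` also accept `daemon` as a keyword argument, so the copy can be made cancellable in one line (same placeholder file names as above):

import shutil
from threading import Thread

Thread(target=shutil.copy, args=['source-file', 'destination'], daemon=True).start()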