I have the following problem: I run Celery with many workers. During Celery startup I create a few subprocesses:
proc = subprocess.Popen("program", stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd)
I need these subprocesses to start once and then be used repeatedly by the Celery workers. So I save the subprocesses in a multiprocessing.Manager().dict() - something like a pool:
pool = multiprocessing.Manager().dict()
pool[proc_id] = proc
All subprocesses are accessible from the Celery workers, but they don't work - I found out that the pipes break the moment a subprocess is shared via the pool. So my first question: is there any way to share subprocess pipes with other processes (the Celery workers)?
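For what it's worth, here is a minimal sketch of what I believe is going on (with cat standing in for my real program): storing a value in a Manager().dict() pickles it to send it to the manager process, and a live Popen with open pipe file objects does not survive pickling:

```python
import pickle
import subprocess

# "cat" stands in for my real program; the pipes are what matters.
proc = subprocess.Popen(["cat"], stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
try:
    # Manager().dict() effectively does this when you assign pool[proc_id] = proc
    pickle.dumps(proc)
    picklable = True
except (TypeError, pickle.PicklingError):
    picklable = False
finally:
    proc.stdin.close()
    proc.wait()

print(picklable)  # -> False: the open pipe file objects cannot be pickled
```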
I also tried saving the pipes in a separate, regular dict. Then, when a worker gets a subprocess from the pool, the pipes are wired back onto it:
proc.stdin = dict_of_pipes[proc_id]
This solution sometimes works, but sometimes a pipe is not found in the dictionary - I guess because sharing a regular dictionary between processes is not OK?
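To confirm my guess, this minimal sketch shows that a regular dict is not shared between processes at all - each process works on its own copy, so an update made in one process is invisible to the others:

```python
import multiprocessing as mp

plain = {}  # a regular dict, NOT a Manager().dict()

def writer():
    # Runs in a child process: this mutates only the child's copy.
    plain["key"] = "value"

if __name__ == "__main__":
    p = mp.Process(target=writer)
    p.start()
    p.join()
    print("key" in plain)  # -> False: the parent's dict never saw the update
```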
As "program" you can imagine /bin/bash. Locking is solved; the dictionaries are never accessed by more than one process at a time.
Second question: is it possible to open a new pipe to a subprocess from any Celery worker? Or is there another solution?