
I had a few manually started processes (via p.start()) handling some background tasks, and I communicated with them via multiprocessing.Pipe(). So far, so good.

Now I have to scale my application, and following the same structure would mean starting too many processes.

So I'm trying to port my code from a handful of manually started multiprocessing.Process instances to a pool of processes. The problem is that multiprocessing.Pipe() does not seem to work with them; it seems I would have to use a queue instead.

Specifically, I was using the code suggested in this Stack Overflow answer to run some generators in the background, but the problem is that now I have many generators.

Many thanks.

nohamk
  • Are you aware of [Process Pools](https://docs.python.org/3/library/multiprocessing.html?highlight=multiprocessing%20pool#module-multiprocessing.pool) ? – stovfl Jun 03 '20 at 10:47
  • @stovfl Yes, in fact, I mentioned it in the question. What I want to do is to use multiprocessing.Pipe (or the closest equivalent) with multiprocessing.Pool – nohamk Jun 03 '20 at 11:46
  • I doubt it; that approach is restricted. Read [Why are Python multiprocessing Pipe unsafe?](https://stackoverflow.com/a/12484615/7414759) – stovfl Jun 03 '20 at 11:53
  • @stovfl I see, thanks for your answer! – nohamk Jun 03 '20 at 14:14

0 Answers