
I have implemented the code below with multiprocessing so that multiple requests can be handled concurrently, but I'm getting the error shown below. I'm using the producer/consumer pattern: producers put work items on a queue, and consumers take items off the queue and do the job.

Traceback (most recent call last):
p.start()
File "/usr/lib/python2.7/multiprocessing/process.py", line 130, in start
self._popen = Popen(self)
File "/usr/lib/python2.7/multiprocessing/forking.py", line 121, in __init__
self.pid = os.fork()
OSError: [Errno 12] Cannot allocate memory


queue = Queue()
lock = Lock()
producers = []
consumers = []

# One producer process per CPU core, for every frame.
for frame in frames:
    producers.extend([Process(target=self.producer, args=(queue, lock, frame))
                      for i in xrange(cpu_count())])

# A fixed set of 50 daemonized consumer processes.
for i in range(50):
    p = Process(target=self.consumer, args=(queue, lock))
    p.daemon = True
    consumers.append(p)

for p in producers:
    p.start()

for c in consumers:
    c.start()

# Like threading, join() blocks until the producers have finished.
for p in producers:
    p.join()

u_end = time.time()
print u_start, u_end
print('Parent process exiting...')
  • Please suggest how to achieve multiprocessing in a Python/Django project while avoiding the GIL. – Anil Kumar Gupta Dec 19 '17 at 09:01
  • @Dominique the error is shown on top of the code – N. Ivanov Dec 19 '17 at 09:02
  • Yes, the error is at the top and the implementation is below it. – Anil Kumar Gupta Dec 19 '17 at 09:03
  • Possible duplicate of [Python cannot allocate memory using multiprocessing.pool](https://stackoverflow.com/questions/26717120/python-cannot-allocate-memory-using-multiprocessing-pool) – Sanket Dec 19 '17 at 09:07
  • Don't do this. There exists a perfectly good solution for running multiple consumers against a Django project: [Celery](http://www.celeryproject.org/). – Daniel Roseman Dec 19 '17 at 09:10
  • @DanielRoseman I don't want to use a third-party package for this. If there is anything apart from multiprocessing in Python, please let me know. – Anil Kumar Gupta Dec 19 '17 at 09:20
  • Your `frames` iterable is probably really big; check its size and don't start a process for EACH element in it. – Or Duan Dec 19 '17 at 09:34
  • My problem is that I'm processing a video that has been converted into a list of clips, and each clip is further converted into frames. I want each frame's information to be uploaded to a bucket using multiprocessing, because plain iteration takes a long time, and when I use multiprocessing I get the error above. Please suggest a fix; thanks in advance. – Anil Kumar Gupta Dec 20 '17 at 02:21
  • Input_data: a video >> a function converts it into clips >> ClipToFrameFunction converts each clip into frames >> now I want to process the list of frames with multiprocessing, i.e. {"clip1":[f1,f2,f3,.],"clip2":[f1,f2,f3,..],"clip3":[f1,f2,f3,...], ..., "clipN":[f1,f2,f3..]} – Anil Kumar Gupta Dec 20 '17 at 02:28

0 Answers