I tried some multiprocessing examples, mainly from http://toastdriven.com/blog/2008/nov/11/brief-introduction-multiprocessing/ , where I took the 'simple application', which uses multiprocessing to test URLs. When I run it (in Python 3.3, on Windows, in the PyCharm IDE) with some modifications and a large number of URLs, my script never stops, and I don't see why.
    import httplib2
    import sys

    from multiprocessing import Lock, Process, Queue, current_process


    def worker(work_queue, done_queue):
        # Pull URLs until the 'STOP' sentinel, report each result on done_queue.
        for url in iter(work_queue.get, 'STOP'):
            try:
                print("In : %s - %s." % (current_process().name, url))
                status_code = print_site_status(url)
                done_queue.put("%s - %s got %s." % (current_process().name, url, status_code))
            except:
                done_queue.put("%s failed on %s with: %s" % (current_process().name, url, str(sys.exc_info()[0])))
        print("Out : %s " % (current_process().name))
        return True


    def print_site_status(url):
        http = httplib2.Http(timeout=10)
        headers, content = http.request(url)
        return headers.get('status', 'no response')


    def main():
        workers = 8
        work_queue = Queue()
        done_queue = Queue()
        processes = []

        with open("Annu.txt") as f:  # file with URLs, one per line
            lines = f.read().splitlines()

        for surl in lines:
            work_queue.put(surl)

        for w in range(workers):
            p = Process(target=worker, args=(work_queue, done_queue))
            p.start()
            processes.append(p)
            work_queue.put('STOP')  # one sentinel per worker

        for p in processes:
            p.join()

        print("END")

        done_queue.put('STOP')
        for status in iter(done_queue.get, 'STOP'):
            print(status)


    if __name__ == '__main__':
        main()
I do see the status of every URL tested, and the 'Out' message from each process indicating that it has finished, but never my 'END' message. The list of URLs I use is: http://www.pastebin.ca/2946850 .
So ... where is my error? Is this a duplicate of: Python multiprocessing threads never join when given large amounts of work?
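If that question applies here, my understanding (from the Python docs on joining processes that use queues) is that a process which has put items on a multiprocessing.Queue will not terminate until a background feeder thread has flushed all buffered items to the pipe, so calling join() before the queue is drained can deadlock. A sketch of what I guess the fix would be, draining done_queue before joining (not verified; it relies on each URL producing exactly one result message, which the try/except in worker seems to guarantee):

    # Each URL yields exactly one message (success or failure branch),
    # so consume len(lines) results BEFORE joining the workers.
    for _ in range(len(lines)):
        print(done_queue.get())

    for p in processes:
        p.join()

    print("END")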
One more piece of information: when I remove 'done_queue' everywhere in the code, it works.
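To be precise, by removing 'done_queue' I mean a worker like this, which prints results directly instead of putting them on a second queue (with this version every process joins and my 'END' message is printed):

    def worker(work_queue):
        # Same loop as before, but results go to stdout instead of done_queue.
        for url in iter(work_queue.get, 'STOP'):
            try:
                print("In : %s - %s." % (current_process().name, url))
                status_code = print_site_status(url)
                print("%s - %s got %s." % (current_process().name, url, status_code))
            except:
                print("%s failed on %s with: %s" % (current_process().name, url, str(sys.exc_info()[0])))
        print("Out : %s " % (current_process().name))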