I'm working on a project that requires me to ping a little under 10,000 IP addresses. Unfortunately, I'm running on an 8 GB machine, and the script crashes once I get somewhere near 2,000 IPs. Here is my code (most of which I took from Fast ping sweep in python):
import multiprocessing

# pinger is the worker from the linked answer; jobs and results are shared queues
jobs = multiprocessing.Queue()
results = multiprocessing.Queue()

logs = open('addresses.txt', 'r').readlines()

# one Process per address -- with ~10,000 lines that's ~10,000 processes
pool = [multiprocessing.Process(target=pinger, args=(jobs, results)) for log in logs]
for p in pool:
    p.start()
for i in logs:
    jobs.put(i)
for p in pool:
    jobs.put(None)  # one sentinel per worker so each one exits
for p in pool:
    p.join()
I'm very new to multiprocessing, but is there a way to keep using it while only assigning jobs to a fixed number of workers at a time, trading time for memory, so that as jobs complete the workers get reassigned to unprocessed addresses? Sorry if this is unclear -- again, I'm new to this.
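To make the question concrete, here is an untested sketch of what I think I want: a fixed number of worker processes sharing one jobs queue, so the process count no longer depends on the address count. The pool size of 32 is an arbitrary guess, and the pinger worker is my reconstruction of the one from the linked answer (the -c 1 flag is the Linux/macOS form; Windows uses -n 1):

import multiprocessing
import subprocess

NUM_WORKERS = 32  # fixed pool size -- an arbitrary guess, not tuned

def pinger(job_q, results_q):
    # Keep pulling addresses until the None sentinel arrives.
    while True:
        ip = job_q.get()
        if ip is None:
            return
        # One echo request per address; rc == 0 means the host answered.
        rc = subprocess.call(['ping', '-c', '1', ip],
                             stdout=subprocess.DEVNULL,
                             stderr=subprocess.DEVNULL)
        results_q.put((ip, rc == 0))

if __name__ == '__main__':
    jobs = multiprocessing.Queue()
    results = multiprocessing.Queue()

    with open('addresses.txt') as f:
        ips = [line.strip() for line in f if line.strip()]

    # A fixed pool: worker count is independent of the address count.
    workers = [multiprocessing.Process(target=pinger, args=(jobs, results))
               for _ in range(NUM_WORKERS)]
    for w in workers:
        w.start()

    for ip in ips:
        jobs.put(ip)
    for _ in workers:
        jobs.put(None)  # one sentinel per worker so each exits cleanly

    # Drain every result before joining so no worker blocks on a full pipe.
    for _ in ips:
        ip, alive = results.get()
        print(ip, 'up' if alive else 'down')

    for w in workers:
        w.join()

The idea is that each worker grabs the next address as soon as it finishes the previous one, so the queue itself does the reassignment I described. Is this the right approach, or is there a cleaner way to do it?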