
I have a simple problem / question about the code below.

ip = '192.168.0.'
count = 0
while count <= 255:
    print(count)
    count += 1
    for i in range(10):
        ipg=ip+str(count)
        t = Thread(target=conn, args=(ipg,80))
        t.start()

I want to execute 10 threads each time, wait for them to finish, and then continue with the next 10 threads until count <= 255.

I understand my problem and why it executes 10 threads for every count increase, but not how to solve it. Any help would be appreciated.

r-d-r-b-3
  • I guess I should add a Queue – r-d-r-b-3 Dec 22 '16 at 01:18
  • Consider using the largely undocumented [`ThreadPool`](http://stackoverflow.com/a/3386632/355230) class in `multiprocessing.pool` with `ThreadPool(processes=10)`. You could use `while not(all(a_thread.ready() for a_thread in results)): pass` to wait for all 10 threads to be idle each iteration. See [my answer](http://stackoverflow.com/a/18283388/355230) to another question. – martineau Dec 22 '16 at 01:27
  • Thanks for your answer; at the moment I'm working with a queue list, but I guess the advantage with a pool is that after one process of the 10 is done it starts another instead of waiting for all 10 to finish. Right? – r-d-r-b-3 Dec 22 '16 at 01:31
  • True, but if you feed it 10 things at a time and then wait for them all to finish before feeding any more into it, you'll be able to accomplish what you want to do. – martineau Dec 22 '16 at 01:33

3 Answers


There are two viable options: multiprocessing with ThreadPool, as @martineau suggested, or using a queue. Here's an example with a queue that executes requests concurrently in 10 different threads; a short sketch of the ThreadPool option appears after the output below. Note that the queue version doesn't do any kind of batching: as soon as a thread completes, it picks up the next task without caring about the status of the other workers:

import queue
import threading

def conn():
    # Worker: keep pulling (ip, port) pairs until the queue is empty.
    try:
        while True:
            ip, port = que.get_nowait()
            print('Connecting to {}:{}'.format(ip, port))
            que.task_done()
    except queue.Empty:
        pass

que = queue.Queue()
for i in range(256):
    que.put(('192.168.0.' + str(i), 80))

# Start workers
threads = [threading.Thread(target=conn) for _ in range(10)]
for t in threads:
    t.start()

# Wait for the queue to empty
que.join()

# Wait for the workers to finish
for t in threads:
    t.join()

Output:

Connecting to 192.168.0.0:80
Connecting to 192.168.0.1:80
Connecting to 192.168.0.2:80
Connecting to 192.168.0.3:80
Connecting to 192.168.0.4:80
Connecting to 192.168.0.5:80
Connecting to 192.168.0.6:80
Connecting to 192.168.0.7:80
Connecting to 192.168.0.8:80
Connecting to 192.168.0.9:80
Connecting to 192.168.0.10:80
Connecting to 192.168.0.11:80
...
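
For completeness, here is a minimal, untested sketch of the other option mentioned above, multiprocessing.pool.ThreadPool; the print in conn() is only a stand-in for the real connection logic from the question:

from multiprocessing.pool import ThreadPool

def conn(ip, port):
    # stand-in for the real connection logic from the question
    print('Connecting to {}:{}'.format(ip, port))

addresses = [('192.168.0.' + str(i), 80) for i in range(256)]

pool = ThreadPool(processes=10)  # at most 10 worker threads run at once
pool.starmap(conn, addresses)    # blocks until every address has been handled
pool.close()
pool.join()
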
niemmi

I modified your code so that it has the correct logic to do what you want. Please note that I didn't run it, but I hope you'll get the general idea:

import time
from threading import Thread

ip = '192.168.0.'
count = 0
while count <= 255:
    print(count)
    # a list to keep your threads while they're running
    alist = []
    for i in range(10):
        # count must be increased here so each batch covers the next 10 addresses
        count += 1
        if count > 255:
            break
        ipg = ip + str(count)
        t = Thread(target=conn, args=(ipg, 80))
        t.start()
        alist.append(t)

    # poll until every thread in this batch has finished
    while alist:
        time.sleep(0.01)
        # rebuild the list instead of removing items while iterating;
        # is_alive() replaces the deprecated isAlive()
        alist = [t for t in alist if t.is_alive()]
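
An alternative to polling with is_alive(), not part of the original answer, is to simply join() every thread in the batch. A minimal sketch of that approach, again with a print stub standing in for the conn() from the question:

from threading import Thread

def conn(ip, port):
    # stand-in for the real conn() from the question
    print('Connecting to {}:{}'.format(ip, port))

ip = '192.168.0.'
for start in range(0, 256, 10):
    batch = []
    for n in range(start, min(start + 10, 256)):
        t = Thread(target=conn, args=(ip + str(n), 80))
        t.start()
        batch.append(t)
    for t in batch:
        t.join()  # wait for the whole batch before starting the next one
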
Organis

This can easily be achieved using the concurrent.futures library.

Here's the example code:

from concurrent.futures import ThreadPoolExecutor

ip = '192.168.0.'
count = 0

THREAD_COUNT = 10

def work_done(future):
    result = future.result()
    # work with your result here


def main():
    global count  # count is defined at module level
    with ThreadPoolExecutor(THREAD_COUNT) as executor:
        while count <= 255:
            ipg = ip + str(count)
            # conn is the function from the question
            executor.submit(conn, ipg, 80).add_done_callback(work_done)
            count += 1

if __name__ == '__main__':
    main()

Here the executor returns a future for every task it submits. Keep in mind that if you use add_done_callback(), the finished task is handed back to the main thread (which would block your main thread); if you really want true parallelism, you should wait for the future objects separately. Here's the code snippet for that.

from concurrent.futures import ThreadPoolExecutor, wait

futures = []
with ThreadPoolExecutor(THREAD_COUNT) as executor:
    while count <= 255:
        ipg = ip + str(count)
        futures.append(executor.submit(conn, ipg, 80))
        count += 1

done, not_done = wait(futures)

for future in done:
    result = future.result()
    # work with your result here

Hope this helps!

Asav Patel
  • I prefer this answer because ThreadPoolExecutor seems the best way to go. The other answers were also correct. Thanks everyone! – r-d-r-b-3 Dec 22 '16 at 03:10