5

I was wondering how I could limit something like this to use only 10 threads at a time

from threading import Thread

with open("data.txt") as f:
    for line in f:
        lines = line.rstrip("\n\r")
        t1 = Thread(target=Checker, args=("company",))  # args must be a tuple, hence the trailing comma
        t1.start()
Yevhen Kuzmovych
  • Just monitor the threads alive and block until one of the threads exits if there are 10 of them or more. – ForceBru Jan 21 '17 at 15:28
  • Possible duplicate of [How do I limit the number of active threads in python?](http://stackoverflow.com/questions/1787397/how-do-i-limit-the-number-of-active-threads-in-python) – zondo Jan 21 '17 at 15:30
  • Possible duplicate of [Python thread pool similar to the multiprocessing Pool?](http://stackoverflow.com/questions/3033952/python-thread-pool-similar-to-the-multiprocessing-pool) – Amit Kumar Jan 21 '17 at 15:32
  • Does this answer your question? [How to limit number of concurrent threads in Python?](https://stackoverflow.com/questions/18347228/how-to-limit-number-of-concurrent-threads-in-python) – feetwet Aug 29 '23 at 01:36

4 Answers

7

Use Python's concurrent.futures.ThreadPoolExecutor with the max_workers argument set to 10.

Something like this:

from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=10)
with open("data.txt") as f:
    for line in f:
        lines = line.rstrip("\n\r")
        pool.submit(Checker, "company")

pool.shutdown(wait=True)

The pool creates threads lazily as work comes in, but never more than 10, so at most 10 submissions run concurrently. The first argument to pool.submit() is the callable to run; any further arguments are passed through to it.

pool.shutdown(wait=True) waits for all threads to complete execution.
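
submit() also returns a Future for each call, so you can collect results once the workers finish. A minimal sketch of that, assuming Checker should receive the stripped line (the snippets above pass the literal "company"):

from concurrent.futures import ThreadPoolExecutor, as_completed

futures = []
with ThreadPoolExecutor(max_workers=10) as pool:  # shutdown(wait=True) happens on exiting the with block
    with open("data.txt") as f:
        for line in f:
            line = line.rstrip("\r\n")
            # never more than 10 submissions run at the same time
            futures.append(pool.submit(Checker, line))

for future in as_completed(futures):
    print(future.result())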

Shihab Shahriar Khan
2

Use the ThreadPoolExecutor and tell it that you want 10 threads.

import concurrent.futures

def your_function_processing_one_line(line):
    pass  # your computations

with open("data.txt") as f, concurrent.futures.ThreadPoolExecutor(10) as executor:
    result = executor.map(your_function_processing_one_line, [line for line in f])

...and you will have all the results in result (an iterator over the return values, in input order).

yogabonito
  • what if multiple params ? – Raheel Dec 12 '17 at 07:43
  • 1
    This is also possible. Take a look at [this nice answer](https://stackoverflow.com/questions/6785226/pass-multiple-parameters-to-concurrent-futures-executor-map/6976772#6976772). – yogabonito Dec 30 '17 at 12:05
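
For reference, a minimal sketch of the multiple-parameter case mentioned in the comment above, using functools.partial to fix the extra argument (the function and argument names here are only illustrative):

import concurrent.futures
from functools import partial

def check(line, company):
    pass  # your computations using both parameters

with open("data.txt") as f, concurrent.futures.ThreadPoolExecutor(10) as executor:
    results = executor.map(partial(check, company="ACME"), f)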
1

I wrote this loop to cap the number of threads at a set limit. It relies on a preset array of commands (alltests) to process; I have borrowed some elements from other answers for the thread launch.

import logging, os, threading, time

# set number of threads
threadcount = 20

# alltests is an array of shell commands (test data) defined elsewhere
numbertests = len(alltests)
testcounter = numbertests

def worker(command):
    """thread worker function: run one command"""
    os.system(command)

# run tests
for test in alltests:
    # launch worker thread for this command
    t = threading.Thread(target=worker, args=(test,))
    t.start()
    testcounter -= 1
    # cap the threads if over the limit
    while threading.active_count() >= threadcount:
        active = threading.active_count()
        message = "Excessive threads, pausing 5 secs - " + str(active)
        print(message)
        logging.info(message)
        time.sleep(5)

# monitor for threads winding down
while threading.active_count() != 1:
    active = threading.active_count()
    message = "Active threads running - " + str(active)
    print(message)
    logging.info(message)
    time.sleep(5)
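
A common alternative to polling active_count() is to cap concurrency with a semaphore that each worker releases when it finishes. A minimal sketch of that variant, reusing the alltests array of commands from above:

import os, threading

threadcount = 20
limiter = threading.BoundedSemaphore(threadcount)

def worker(command):
    """run one command, then free a slot"""
    try:
        os.system(command)
    finally:
        limiter.release()

threads = []
for test in alltests:
    limiter.acquire()  # blocks while threadcount workers are already running
    t = threading.Thread(target=worker, args=(test,))
    threads.append(t)
    t.start()

for t in threads:
    t.join()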
Tarek Ali
-1

(for both Python 2.6+ and Python 3)

Use the ThreadPool from the multiprocessing module:

from multiprocessing.pool import ThreadPool

The only thing is that it is not well documented...
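
It mirrors the multiprocessing.Pool interface, so a minimal usage sketch (assuming Checker takes one line, as in the question) looks like this:

from multiprocessing.pool import ThreadPool

with open("data.txt") as f:
    lines = [line.rstrip("\r\n") for line in f]

pool = ThreadPool(10)               # at most 10 worker threads
results = pool.map(Checker, lines)  # same interface as multiprocessing.Pool.map
pool.close()
pool.join()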

mguijarr