
I want to limit the number of active threads. What I have seen is that a finished thread stays alive and does not exit, so the number of active threads keeps growing until an error occurs.

The following code runs only 8 threads at a time, but they stay alive even after they have finished, so the number keeps growing:

import threading
import time


class ThreadEx(threading.Thread):
    __thread_limiter = None
    __max_threads = 2

    @classmethod
    def max_threads(cls, thread_max):
        ThreadEx.__max_threads = thread_max
        ThreadEx.__thread_limiter = threading.BoundedSemaphore(value=ThreadEx.__max_threads)

    def __init__(self, target=None, args:tuple=()):
        super().__init__(target=target, args=args)
        if not ThreadEx.__thread_limiter:
            ThreadEx.__thread_limiter = threading.BoundedSemaphore(value=ThreadEx.__max_threads)

    def run(self):
        ThreadEx.__thread_limiter.acquire()
        try:
            #success = self._target(*self._args)
            #if success: return True
            super().run()
        except:
            pass
        finally:
            ThreadEx.__thread_limiter.release()


def call_me(test1, test2):
    print(test1 + test2)
    time.sleep(1)


ThreadEx.max_threads(8)

for i in range(0, 99):
    t = ThreadEx(target=call_me, args=("Thread count: ", str(threading.active_count())))
    t.start()

Due to the for loop, the number of threads keeps growing to 99. I know that a thread has done its work because call_me has been executed and threading.active_count() has been printed.

Does somebody know how I can make sure a finished thread does not stay alive?

Rednib
You might find this useful: https://stackoverflow.com/questions/323972/is-there-any-way-to-kill-a-thread-in-python – RPT Feb 08 '18 at 10:57

2 Answers


This may be a silly answer, but to me it looks like you are trying to reinvent ThreadPool.

from multiprocessing.pool import ThreadPool
from time import sleep

p = ThreadPool(8)

def call_me(test1):
    print(test1)
    sleep(1)

for i in range(0, 99):
    p.apply_async(call_me, args=(i,))

p.close()
p.join()

This will ensure only 8 concurrent threads are running your function at any point in time. If you want a bit more performance, you can import Pool from multiprocessing and use that instead. The interface is exactly the same, but the pool will consist of subprocesses instead of threads, which usually gives a performance boost because the GIL does not get in the way.
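
A minimal sketch of that swap, assuming call_me and its arguments are picklable (the __main__ guard matters because worker processes may re-import the module):

from multiprocessing import Pool
from time import sleep

def call_me(test1):
    print(test1)
    sleep(1)

if __name__ == '__main__':
    p = Pool(8)  # same interface as ThreadPool, but backed by 8 worker processes
    for i in range(0, 99):
        p.apply_async(call_me, args=(i,))
    p.close()  # no more tasks will be submitted
    p.join()   # wait for all submitted tasks to finish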

Hannu

I have changed the class following Hannu's suggestion.

I am posting it for reference; maybe it is useful for others who come across this post:

import threading
from multiprocessing.pool import ThreadPool
import time

class MultiThread:
    __thread_pool = None

    @classmethod
    def begin(cls, max_threads):
        MultiThread.__thread_pool = ThreadPool(max_threads)

    @classmethod
    def end(cls):
        MultiThread.__thread_pool.close()
        MultiThread.__thread_pool.join()

    def __init__(self, target=None, args:tuple=()):
        self.__target = target
        self.__args = args

    def run(self):
        try:
            result = MultiThread.__thread_pool.apply_async(self.__target, args=self.__args)
            return result.get()
        except:
            pass


def call_me(test1, test2):
    print(test1 + test2)
    time.sleep(1)
    return 0


MultiThread.begin(8)
for i in range(0, 99):
    t = MultiThread(target=call_me, args=("Thread count: ", str(threading.active_count())))
    t.run()
MultiThread.end()

The maximum number of threads at any given time is 8, as determined by the begin method. The run method also returns the result of the passed function, if it returns something.
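
One caveat, as a sketch rather than a fix: result.get() inside run blocks until that single task finishes, so the tasks in the loop above complete one after another. To keep up to 8 calls in flight at once, a variant (illustrative names, same ThreadPool API) is to store the AsyncResult objects and collect them after the loop:

import threading
import time
from multiprocessing.pool import ThreadPool

def call_me(test1, test2):
    print(test1 + test2)
    time.sleep(1)
    return 0

pool = ThreadPool(8)
results = []
for i in range(0, 99):
    # apply_async returns immediately; the pool runs up to 8 calls concurrently
    results.append(pool.apply_async(call_me, args=("Thread count: ", str(threading.active_count()))))
pool.close()
pool.join()
values = [r.get() for r in results]  # gather the return values afterwards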

Hope that helps.

Rednib