I have a list of URLs and want to scrape them all with one function. I used the code below, but the problem is that after some time it uses up all the memory in the system, which makes the system crash. I want a solution that uses only 5-10 threads at a time.
from threading import Thread
import time

threads = []
count = 0
for url in input:  # one thread per URL -- unbounded
    count += 1
    _thread1 = Thread(target=self.hitandsave, args=(url, count))
    _thread1.start()
    threads.append(_thread1)
    time.sleep(0.2)
for t in threads:
    t.join()
I tried the code above, but I want Python code that iterates through the list while using only 5 to 10 threads at a time.
I also tried https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.ThreadPoolExecutor, but after some time the system hung and the process got killed.
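For reference, here is a minimal sketch of the pattern I am aiming for: a `ThreadPoolExecutor` capped at 5 workers, so no more than 5 URLs are processed concurrently no matter how long the input list is. `hit_and_save` is a placeholder standing in for my real `self.hitandsave` method, and the URL list is made up for illustration.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def hit_and_save(url, count):
    # Placeholder for the real scraping/saving logic (self.hitandsave).
    return (count, url)

urls = ["https://example.com/page/%d" % i for i in range(20)]

results = []
# max_workers=5 bounds concurrency: only 5 threads ever run at once,
# and the pool reuses them instead of spawning one thread per URL.
with ThreadPoolExecutor(max_workers=5) as executor:
    futures = {
        executor.submit(hit_and_save, url, i): url
        for i, url in enumerate(urls, start=1)
    }
    for future in as_completed(futures):
        results.append(future.result())
```

Is something like this the right approach, and if so, what could still be exhausting memory when I use it with my real scraping function?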