I'm trying to perform a number of actions in parallel with Python requests. Here is my code:
import threading
import resource
import time
import sys

import requests

# Maximum open-file limit, used as the cap for the thread limiter.
maxOpenFileLimit = resource.getrlimit(resource.RLIMIT_NOFILE)[0]  # For example, it shows 50.
threadLimiter = maxOpenFileLimit  # Never run more threads than open files allowed.

# One shared session for every thread.
requestSessions = requests.Session()
# Make the connection pool bigger to prevent [Errno -3] when sockets are stuck in CLOSE_WAIT.
adapter = requests.adapters.HTTPAdapter(pool_maxsize=(maxOpenFileLimit + 100))
requestSessions.mount('http://', adapter)
requestSessions.mount('https://', adapter)

number = 0  # Count of completed actions.
numberLock = threading.Lock()  # Protects the shared counter.

def threadAction(a1, a2):
    global number
    time.sleep(1)  # My actions with requests for each thread.
    with numberLock:
        number += 1
        print(number)

ThreadActions = []  # Action tasks.
for i in range(50):  # I have 50 websites I need to do in parallel threads.
    a1 = i
    for n in range(10):  # Every website gets 10 worker threads.
        a2 = n
        ThreadActions.append(threading.Thread(target=threadAction, args=(a1, a2)))

for item in ThreadActions:
    # But I can't run more than 50 threads at once, because of maxOpenFileLimit.
    while threading.active_count() >= threadLimiter:
        # Thread limiter, a crude analogue of BoundedSemaphore.
        continue
    item.start()

for item in ThreadActions:
    item.join()
But the thing is that once 50 threads are up, the thread limiter starts waiting for some thread to finish its work. And here is the problem: after the script reaches the limiter, lsof -i | grep python | wc -l shows far fewer than 50 active connections, whereas before the limiter it showed all <= 50 of them. Why is this happening? Or should I call close() after each request instead of reusing one requests.Session(), to prevent it from holding on to already opened sockets?
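Since my limiter comment mentions BoundedSemaphore: here is a minimal sketch of what I mean by a proper limiter, using an actual threading.BoundedSemaphore instead of the busy-wait loop. The MAX_WORKERS value, the worker body, and the short sleep are placeholder assumptions standing in for my real request code:

```python
import threading
import time

MAX_WORKERS = 50  # assumed cap, matching the RLIMIT_NOFILE value above

limiter = threading.BoundedSemaphore(MAX_WORKERS)
done = []            # records which (site, slot) jobs finished
doneLock = threading.Lock()

def worker(site, slot):
    # The semaphore slot was acquired before this thread was started;
    # release it when the work is done so the next queued thread can run.
    try:
        time.sleep(0.01)  # placeholder for the real requests call
        with doneLock:
            done.append((site, slot))
    finally:
        limiter.release()

threads = []
for site in range(50):        # 50 websites
    for slot in range(10):    # 10 worker threads per website
        threads.append(threading.Thread(target=worker, args=(site, slot)))

for t in threads:
    limiter.acquire()  # blocks while MAX_WORKERS threads are still running
    t.start()

for t in threads:
    t.join()

print(len(done))  # 500
```

Unlike the `while` loop, `acquire()` blocks without spinning the CPU, and the count can never overshoot because a slot is only freed in the worker's `finally` block.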