I'm using the Python requests package to make a large number of requests to an API. At some point, however, my program crashes with 'too many open files'. Since I explicitly close my session, I don't understand how this can happen.
I use the following code:
import requests
import multiprocessing
import numpy as np
import time

s = requests.session()
s.keep_alive = False

def request(i, mapId, minx, maxx, miny, maxy):
    print(i)
    try:
        with requests.Session() as s:
            r = s.post(
                url + "metadata/polygons",
                timeout=10,
                json={
                    "mapId": mapId,
                    "layer": "percelen",
                    "xMin": minx,
                    "xMax": maxx,
                    "yMin": miny,
                    "yMax": maxy,
                },
            )
            out = r.json()
            s.close()
    except Exception:
        print("something went wrong with: " + str(i))

for i in np.arange(10000):
    time.sleep(1)
    multiprocessing.Process(target=request, args=argsList[i]).start()
Any help or insights would be greatly appreciated, as I'm out of ideas.