I am using the Google Natural Language Content Classification API.
I authenticate with a service account .json key file whose path is exposed
through the GOOGLE_APPLICATION_CREDENTIALS environment variable.
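
For context, each container essentially runs the standard classify-text flow; here is a minimal sketch, assuming the google-cloud-language Python client (the sample text is a placeholder):

```python
from google.cloud import language_v1

# The client picks up credentials implicitly from GOOGLE_APPLICATION_CREDENTIALS.
client = language_v1.LanguageServiceClient()

document = language_v1.Document(
    content="Placeholder text, long enough for the classifier to work with.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

response = client.classify_text(request={"document": document})
for category in response.categories:
    print(category.name, category.confidence)
```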
There is no issue when I run my classification script as a single instance.
However, when I run the script in parallel (4, 6, 8, or 10 Docker containers on one machine), I occasionally get the error below:
```
[Errno 24] Too many open files: '/PATH/TO/MY-JSON_KEY.json'
```
I have read related issues which suggest increasing the ulimit, but that seems more like a way to sidestep the underlying problem than to fix it.
It seems like the Google client library might be opening the credential file on each API call but not closing it?
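
One workaround I am considering, assuming the problem really is that the key file gets re-read per call: load the credentials explicitly once at startup and reuse a single long-lived client, so the file is only opened once (from_service_account_file is part of the google-auth library):

```python
from google.cloud import language_v1
from google.oauth2 import service_account

# Read the key file once at startup instead of letting each call load it
# implicitly from GOOGLE_APPLICATION_CREDENTIALS.
credentials = service_account.Credentials.from_service_account_file(
    "/PATH/TO/MY-JSON_KEY.json"
)

# Reuse one client for all requests in this process.
client = language_v1.LanguageServiceClient(credentials=credentials)
```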
UPDATE
Here is a longer error message that I managed to retrieve:
```
google.auth.exceptions.TransportError: HTTPSConnectionPool(host='oauth2.googleapis.com', port=443): Max retries exceeded with url: /token (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 24] Too many open files'))
```
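
The failure surfaces while the auth library refreshes its OAuth token (the /token request to oauth2.googleapis.com), so it looks like sockets are hitting the descriptor limit too, not only the key file. To check whether descriptors actually accumulate, I am now logging the count after each call; a small Linux-only sketch (open_fd_count is my own helper, fine inside Docker):

```python
import os

def open_fd_count() -> int:
    """Number of file descriptors currently held by this process (Linux-only)."""
    return len(os.listdir("/proc/self/fd"))

# Log the descriptor count after each classification call; a steadily
# growing number would confirm that something is leaking handles.
print("open fds:", open_fd_count())
```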