Has anyone had this error with the S3PrefixSensor?
OSError: [Errno 23] Too many open files in system: '/usr/local/lib/python3.6/dist-packages/botocore/data/endpoints.json'
I get this error when the scheduler runs 12 tasks with that sensor at the same time. If I re-run them manually, they work fine.
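For context, the sensor tasks look roughly like this (a minimal sketch with an Airflow 1.10-style import; the DAG id, bucket, and prefix names are placeholders, not my real values):

    from datetime import datetime

    from airflow import DAG
    from airflow.sensors.s3_prefix_sensor import S3PrefixSensor

    with DAG(
        dag_id="s3_prefix_watch",          # placeholder name
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
    ) as dag:
        # 12 of these sensors end up in the running state at once
        sensors = [
            S3PrefixSensor(
                task_id="wait_for_prefix_{}".format(i),
                bucket_name="my-bucket",                    # placeholder
                prefix="incoming/partition_{}/".format(i),  # placeholder
                poke_interval=60,
            )
            for i in range(12)
        ]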
I tried increasing the ulimit, as suggested in the answer to this question, but it didn't work for me: Errno 24: Too many open files. But I am not opening files?
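One thing I noticed while debugging: Errno 23 (ENFILE, "Too many open files in system") is the system-wide file-table limit, while ulimit -n only raises the per-process limit (Errno 24, EMFILE), so maybe I was adjusting the wrong knob? Here is roughly how I'd inspect both from Python (stdlib only, Linux-specific; a diagnostic sketch, not a fix):

    import resource

    # Per-process open-file limit, which is what `ulimit -n` changes
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print("per-process limit: soft={}, hard={}".format(soft, hard))

    # System-wide ceiling, which is what Errno 23 runs into
    with open("/proc/sys/fs/file-max") as f:
        print("system-wide file-max: {}".format(f.read().strip()))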
It's odd that this error comes up, as I'm only running 12 tasks at a time. Is it an issue with the S3 sensor operator?