
So I have an Azure Functions app written in Python, and quite often the code throws an error like this:

HTTPSConnectionPool(host='www.***.com', port=443): Max retries exceeded with url: /x/y/z (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7faba31d0438>: Failed to establish a new connection: [Errno 110] Connection timed out',))

This happens in a few different functions that make https connections.

I contacted support and they told me that this was caused by SNAT port exhaustion and advised me to: "Modify the application to reuse connections instead of creating a connection per request, use connection pooling, use service endpoints if you are connecting to resources in Azure." They sent me this link https://4lowtherabbit.github.io/blogs/2019/10/SNAT/ and also this https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections

Problem is, I am unsure how to practically reuse and/or pool connections in Python, and I am unsure what the primary cause of the exhaustion is, as this data is not publicly available.

So I am looking for help with applying their advice to all our http(s) and database connections.

I made the assumption that pymongo and pyodbc (the database clients we use) would handle pooling and reuse despite me creating a new client each time a function runs. Is this incorrect, and if so, how do I reuse these database clients in Python to prevent this?
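For reference, the pattern I'd try is caching the client at module scope so one client (and its internal connection pool) survives across invocations. The factory below is a placeholder, not real code I have verified; in the actual function it would return `pymongo.MongoClient(CONNECTION_STRING)` or `pyodbc.connect(DSN)`:

```python
# Sketch: create the database client once per worker process instead of
# once per invocation, so its internal pool can actually be reused.

_client = None

def get_client():
    global _client
    if _client is None:
        # Stand-in for MongoClient(...) or pyodbc.connect(...); a plain
        # object keeps this sketch runnable without the drivers installed.
        _client = object()
    return _client

def main(req):  # hypothetical Azure Functions entry point
    client = get_client()  # same object on every call; never closed per request
    # ... run queries with `client` here ...
    return client
```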

The problem has so far only occurred when using requests (or the zeep SOAP library, which internally defaults to requests) to hit an https endpoint. Is there any way I could improve how I use requests, like reusing sessions or closing connections explicitly? I am aware that requests creates a session in the background when calling requests.get, but my knowledge of the library is insufficient to figure out whether this is the problem and how I could solve it. I am thinking I might be able to create and reuse a single session instance for each specific http(s) call in each function, but I am unsure if this is correct, and I also have no idea how to actually do it.
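This is the kind of session reuse I'm considering for requests, based on the docs linked above. The host, path, and pool sizes below are placeholders, not our real endpoint:

```python
import requests
from requests.adapters import HTTPAdapter

# One session per worker process; its connection pool keeps sockets open
# and reuses them instead of opening a fresh connection per request.
SESSION = requests.Session()

# Optionally raise the pool size so concurrent calls to the same host can
# each reuse a pooled connection (requests defaults to 10/10).
ADAPTER = HTTPAdapter(pool_connections=10, pool_maxsize=100)
SESSION.mount("https://", ADAPTER)

def call_endpoint(path):
    # Placeholder host; the timeout avoids hanging on a dead connection.
    return SESSION.get("https://www.example.com" + path, timeout=10)
```

If this is right, I assume zeep could share the same session via `zeep.transports.Transport(session=SESSION)`.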

In a few places I also use aiohttp and, if possible, would like to achieve the same thing there.
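Analogously for aiohttp, my guess is a lazily created shared session; a plain object stands in here for a real `aiohttp.ClientSession` (which pools connections through its TCPConnector), since I haven't tested this:

```python
import asyncio

# Sketch: one shared client session per worker process, created lazily
# inside the event loop and reused by every coroutine.
_session = None

async def get_session():
    global _session
    if _session is None:
        # Real code: _session = aiohttp.ClientSession()
        _session = object()  # placeholder so this sketch runs without aiohttp
    return _session

async def fetch(url):
    session = await get_session()
    # Real code would be:
    # async with session.get(url) as resp:
    #     return await resp.text()
    return session  # placeholder just hands back the shared session
```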

I haven't looked into service endpoints yet but I am about to.

So in short: what can I do in practice to ensure reuse/pooling with requests, pyodbc, pymongo and aiohttp?

Gustav Eiman
  • See [this answer](https://stackoverflow.com/a/34893364/4148708). TL;DR - use an HTTPAdapter with a connection pool, beware sessions are not thread safe. – evilSnobu Jan 07 '20 at 10:56
  • Sorry, I might be misunderstanding you. If the default pool size is 10, then I am by default using an HTTPAdapter with a connection pool? Do you mean it would make a difference if I changed the size? – Gustav Eiman Jan 07 '20 at 12:49
  • @evilSnobu Or do you mean that I should use a global HTTPAdapter for all my Sessions? That would give them access to the same connections, meaning I could reuse them when creating a new session. – Gustav Eiman Jan 07 '20 at 13:54

0 Answers