
I have an Azure Function running on a consumption plan. When the function is under heavy load, I've been getting a System.InvalidOperationException with the message "The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached."

I'm using dependency injection, and so far I've been registering my Entity Framework Core DbContext with AddDbContextPool. Is DbContext pooling recommended for Azure Functions, or should I rather use AddDbContext?

The connection string to my SQL Server only specifies the server and authentication, so connection pooling should be enabled by default. Is connection pooling also recommended for Azure Functions?
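
For reference, the setup looks roughly like this (AppDbContext, the namespace, and the setting names are placeholders for the real ones):

```csharp
using System;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // The connection string only sets server and credentials, e.g.
            // "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=app;Password=..."
            // so ADO.NET connection pooling stays at its defaults.
            builder.Services.AddDbContextPool<AppDbContext>(options =>
                options.UseSqlServer(
                    Environment.GetEnvironmentVariable("SqlConnectionString")));
        }
    }
}
```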

sveinungf
  • By heavy load, do you mean that there are many concurrent calls? How many concurrent runs are we talking about? – Michael Jul 10 '20 at 12:29
  • The function has a Service Bus topic trigger and we have limited the number of concurrent executions to 16. – sveinungf Jul 10 '20 at 12:46
  • Do you dispose the DbContext after each run? – Michael Jul 10 '20 at 12:47
  • Not explicitly, I'm assuming the dependency injection framework takes care of that. – sveinungf Jul 10 '20 at 12:49
  • Yeah, you are right. Just tested in my own project. – Michael Jul 10 '20 at 13:36
  • I was trying to find a solution for the same problem and finally thought of asking it myself, but it looks like this question already exists. With the .NET Core AddDbContextPool option and Azure Functions v3, I'm also a bit confused about managing SQL connections and DbContext instances, since there will be n function invocations and sometimes scale-out to additional server instances as well. Curious to know if there is a good explanation available out there. – Vicky Jul 15 '20 at 05:02
  • I found this: https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections Maybe it helps. And maybe an Azure Function is not the right choice for your solution? – Michael Jul 26 '20 at 22:32

1 Answer


Apparently, AddDbContextPool is not the same as connection pooling. Instead, it's a pool of DbContext instances: https://stackoverflow.com/a/48444206/5099097

Also, according to https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections#sqlclient-connections, EF and EF Core get connection pooling by default, because they're built on ADO.NET and pooling is ADO.NET's default behavior.

Your function code can use the .NET Framework Data Provider for SQL Server (SqlClient) to make connections to a SQL relational database. This is also the underlying provider for data frameworks that rely on ADO.NET, such as Entity Framework. Unlike HttpClient and DocumentClient connections, ADO.NET implements connection pooling by default. But because you can still run out of connections, you should optimize connections to the database. For more information, see SQL Server Connection Pooling (ADO.NET).
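
If the defaults ever need tuning, the pool is controlled from the connection string on the client; a minimal sketch using SqlConnectionStringBuilder (the server name is made up, and the values shown are the ADO.NET defaults):

```csharp
using Microsoft.Data.SqlClient; // System.Data.SqlClient exposes the same builder

var builder = new SqlConnectionStringBuilder(
    "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=app;Password=...")
{
    Pooling = true,    // the default: connections are pooled per distinct connection string
    MinPoolSize = 0,   // default
    MaxPoolSize = 100  // default; this is the cap behind "max pool size was reached"
};

using var connection = new SqlConnection(builder.ConnectionString);
connection.Open(); // draws a connection from the pool, creating a new one only if needed
```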

I think your best bet is to dial down the concurrency, as you already said in the comments. In addition, it's important to note that connection pooling is managed on the client side (the function app in this case): SQL Server: where is connection pool: on .net side or server side. So while your function app can take advantage of connection pooling, each instance of the app gets its own connection pool as it scales out. That means the scalability benefits of connection pooling aren't as great as they would be for a single client-side app managing one pool.
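
For a Service Bus trigger, that per-instance concurrency cap lives in host.json; a sketch assuming the Functions v2/v3 schema (16 matches the limit mentioned in the comments):

```json
{
  "version": "2.0",
  "extensions": {
    "serviceBus": {
      "messageHandlerOptions": {
        "maxConcurrentCalls": 16
      }
    }
  }
}
```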

Therefore, to get more benefit from connection pooling per instance of your function app, you should do more work per Service Bus trigger. For example, batch several queries together under the same trigger instead of running one query per trigger. Likewise, if you're doing writes in other triggers, you can batch several update/insert operations into a single invocation. According to this, 42 non-query operations is the optimal size per batch in EF Core: https://learn.microsoft.com/en-us/ef/core/performance/efficient-updating#batching
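
A minimal sketch of that idea, assuming an in-process Functions v3 app with the pooled context injected via constructor (Order, AppDbContext, and the topic/subscription names are all illustrative):

```csharp
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public class ProcessOrders
{
    private readonly AppDbContext _db; // resolved from the DbContext pool per invocation

    public ProcessOrders(AppDbContext db) => _db = db;

    [FunctionName("ProcessOrders")]
    public async Task Run(
        // Binding to an array asks the runtime to deliver messages in batches,
        // so one invocation (and one pooled connection) handles many messages.
        [ServiceBusTrigger("orders", "orders-sub", Connection = "ServiceBusConnection")]
        string[] messages)
    {
        foreach (var body in messages)
            _db.Orders.Add(JsonSerializer.Deserialize<Order>(body));

        // A single SaveChanges batches the INSERTs (up to 42 statements per
        // round trip by default for SQL Server) instead of one call per message.
        await _db.SaveChangesAsync();
    }
}
```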

But even better than that is using table-valued parameters to make bulk updates to hundreds or thousands of records at a time. After I made these changes, my errors from hitting the connection limit went away.
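
A TVP-based update through EF Core might look like this (the dbo.IdList table type, the Orders table, and the Processed column are all hypothetical):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;
using Microsoft.EntityFrameworkCore;

public static class OrderBulkUpdates
{
    // Assumes a user-defined table type exists on the server:
    //   CREATE TYPE dbo.IdList AS TABLE (Id int NOT NULL);
    public static async Task MarkProcessedAsync(AppDbContext db, IEnumerable<int> orderIds)
    {
        // Build a DataTable whose shape matches the table type.
        var ids = new DataTable();
        ids.Columns.Add("Id", typeof(int));
        foreach (var id in orderIds)
            ids.Rows.Add(id);

        var parameter = new SqlParameter("@ids", SqlDbType.Structured)
        {
            TypeName = "dbo.IdList",
            Value = ids
        };

        // One round trip updates every matching row, no matter how many ids there are.
        await db.Database.ExecuteSqlRawAsync(
            "UPDATE o SET o.Processed = 1 FROM dbo.Orders o JOIN @ids i ON o.Id = i.Id",
            parameter);
    }
}
```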

Joe Eng