Apparently, AddDbContextPool is not the same as connection pooling. Instead, it's a pool of DbContext instances: https://stackoverflow.com/a/48444206/5099097
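For reference, DbContext pooling is opt-in at registration time. A minimal sketch (MyDbContext and the "MyDb" connection string name are placeholders; this would live in your startup/DI configuration):

```csharp
using Microsoft.EntityFrameworkCore;

// Pools DbContext *instances*, not SQL connections: EF Core resets and
// reuses MyDbContext objects between requests to avoid construction cost.
// The underlying SqlConnections are pooled separately by ADO.NET.
services.AddDbContextPool<MyDbContext>(options =>
    options.UseSqlServer(configuration.GetConnectionString("MyDb")));
```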
Also, according to https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections#sqlclient-connections, EF and EF Core get connection pooling by default, because they sit on top of ADO.NET and ADO.NET pools connections out of the box:
> Your function code can use the .NET Framework Data Provider for SQL Server (SqlClient) to make connections to a SQL relational database. This is also the underlying provider for data frameworks that rely on ADO.NET, such as Entity Framework. Unlike HttpClient and DocumentClient connections, ADO.NET implements connection pooling by default. But because you can still run out of connections, you should optimize connections to the database. For more information, see SQL Server Connection Pooling (ADO.NET).
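Since the pool is keyed to the connection string, you can also make the limits explicit there. These keywords are standard SqlClient settings; the values below are just the defaults spelled out, and the server/credentials are placeholders:

```csharp
// ADO.NET keeps one pool per unique connection string, per process.
// Pooling is on by default; these keywords only make the behavior explicit.
var connectionString =
    "Server=tcp:myserver.database.windows.net;Database=MyDb;" +
    "User ID=myuser;Password=...;" +   // placeholders
    "Pooling=true;" +                  // the default
    "Min Pool Size=0;" +               // the default
    "Max Pool Size=100;";              // the default cap per pool
```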
I think your best bet is to dial down the concurrency, as you already said in the comments (see the host.json sketch below). In addition, it's worth noting that connection pooling is managed on the client side, which here means the function app: SQL Server: where is connection pool: on .net side or server side. So while your func app can take advantage of connection pooling, each instance of the app gets its own pool as it scales out, and the scalability benefit is smaller than it would be with a single client-side app managing one pool.
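For a Service Bus trigger, per-instance concurrency is typically dialed down in host.json. A sketch (the value 8 is arbitrary, and the exact property path depends on which version of the Service Bus extension you're on); you can also cap scale-out itself with the WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT app setting:

```json
{
  "version": "2.0",
  "extensions": {
    "serviceBus": {
      "messageHandlerOptions": {
        "maxConcurrentCalls": 8
      }
    }
  }
}
```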
Therefore, to get more out of connection pooling per instance of your func app, you should do more work per Service Bus trigger. For example, batch several queries together under one trigger invocation instead of issuing one query per trigger, and if you're doing writes in other triggers, batch several update/insert operations together in a single invocation (a sketch follows). According to the EF Core docs, 42 non-query operations is the optimal size per batch: https://learn.microsoft.com/en-us/ef/core/performance/efficient-updating#batching
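To illustrate the batching idea (the queue name, Order entity, and MyDbContext are hypothetical; the point is one invocation, one SaveChanges, one pooled connection):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.EntityFrameworkCore;

public class Order
{
    public int OrderId { get; set; }
    public string Payload { get; set; }
}

public class MyDbContext : DbContext
{
    public MyDbContext(DbContextOptions<MyDbContext> options) : base(options) { }
    public DbSet<Order> Orders => Set<Order>();
}

public class OrderBatchFunction
{
    private readonly MyDbContext _db; // pooled instance from AddDbContextPool

    public OrderBatchFunction(MyDbContext db) => _db = db;

    // Binding to string[] makes the trigger deliver a batch of messages,
    // so one invocation (and one pooled connection) covers many writes.
    [FunctionName("ProcessOrders")]
    public async Task Run(
        [ServiceBusTrigger("orders", Connection = "ServiceBusConnection")] string[] messages)
    {
        foreach (var json in messages)
        {
            var order = System.Text.Json.JsonSerializer.Deserialize<Order>(json);
            _db.Orders.Add(order);
        }

        // EF Core groups these inserts into batched statements (up to its
        // max batch size) and sends them over a single pooled connection.
        await _db.SaveChangesAsync();
    }
}
```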
But even better than that is using table-valued parameters (TVPs) to make bulk updates to hundreds or thousands of records at a time (sketched below). After I made these changes, the errors from hitting the connection limit went away.
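A sketch of the TVP approach, assuming you've created a matching user-defined table type in SQL Server (dbo.OrderIdList, dbo.Orders, and the Processed column are hypothetical names):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

public static class BulkUpdates
{
    // Prerequisite (run once against the database):
    //   CREATE TYPE dbo.OrderIdList AS TABLE (OrderId INT PRIMARY KEY);
    public static async Task MarkOrdersProcessedAsync(
        string connectionString, IEnumerable<int> orderIds)
    {
        // Build an in-memory table matching the user-defined table type.
        var ids = new DataTable();
        ids.Columns.Add("OrderId", typeof(int));
        foreach (var id in orderIds)
            ids.Rows.Add(id);

        using var conn = new SqlConnection(connectionString);
        await conn.OpenAsync();

        // One statement, one round trip, however many rows are in @ids.
        using var cmd = new SqlCommand(
            @"UPDATE o SET o.Processed = 1
              FROM dbo.Orders o
              JOIN @ids i ON i.OrderId = o.OrderId", conn);

        var p = cmd.Parameters.AddWithValue("@ids", ids);
        p.SqlDbType = SqlDbType.Structured;   // marks the parameter as a TVP
        p.TypeName = "dbo.OrderIdList";       // the SQL Server table type

        await cmd.ExecuteNonQueryAsync();
    }
}
```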