
I'm building a complex query to show some statistics results in a web view. The view can have several different filters depending on the user's choice. Also, there is the possibility to use wildcards.

I'm building this query programmatically in C# using SqlParameter objects, so the query looks like this:

sc.CommandText = "SELECT * FROM table " +
                 "WHERE field1 = @filter1 " +
                 "AND field2 LIKE @filter2"; // ...and more parameters

sc.Parameters.Add(
   new SqlParameter("@filter1", SqlDbType.Int) { Value = 1 });

sc.Parameters.Add(
   new SqlParameter("@filter2", SqlDbType.VarChar, 446) { Value = "whatever%" });

This is a very simplified version, but the query itself is not the point. Just keep in mind that it can have different optional parameters (which I think is a pretty common situation).

When I ran this query in SQL Server Management Studio, I noticed a huge slowdown when using parameters. The following two queries, which should behave the same, get different execution plans, and the parameterized one runs a lot slower:

DECLARE @filter1 INT
DECLARE @filter2 VARCHAR(446)
SET @filter1 = 1
SET @filter2 = 'whatever%'

SELECT * FROM table WHERE field1 = @filter1 AND field2 LIKE @filter2

The fast version:

SELECT * FROM table WHERE field1 = 1 AND field2 LIKE 'whatever%'
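As an aside, testing with `DECLARE`/`SET` local variables in SSMS is not quite the same thing as the parameterized query ADO.NET sends: with local variables the optimizer never sees the values at all, while the C# code goes through `sp_executesql`, where the values are sniffed on first compilation. A closer reproduction of what the application actually sends would be something like this (a sketch; the table and column names are just the placeholders from above):

```sql
-- Simulates what ADO.NET sends: a parameterized batch via sp_executesql.
-- Unlike DECLARE/SET local variables, these parameter values are visible
-- to the optimizer when the plan is first compiled.
EXEC sp_executesql
    N'SELECT * FROM table WHERE field1 = @filter1 AND field2 LIKE @filter2',
    N'@filter1 INT, @filter2 VARCHAR(446)',
    @filter1 = 1,
    @filter2 = 'whatever%';
```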

Here is another example of someone with the same issue:

Why does a parameterized query produces vastly slower query plan vs non-parameterized query

It seems there is something called parameter sniffing that can make a parameterized query run slower, but I don't think it applies in my case because this is not a stored procedure.

One of the proposed solutions is to use OPTION (RECOMPILE) or OPTION (OPTIMIZE FOR). I can't do that because I have about ten optional parameters that may or may not be in the filter, and the option doesn't seem to help when using a LIKE.
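For readers unfamiliar with that hint: it is appended to the query text and forces a fresh plan compiled for the actual parameter values on every execution (which is also its cost, since a frequently-run filter query then pays compilation each time). A minimal sketch:

```sql
-- OPTION (RECOMPILE) discards the cached plan and compiles a new one
-- using the current parameter values, as if they were literals.
SELECT * FROM table
WHERE field1 = @filter1
  AND field2 LIKE @filter2
OPTION (RECOMPILE);
```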

So I feel I'm at a dead end, and I'm thinking of getting rid of the parameters and building dynamic literal queries in the code. But then SQL injection comes into play.

So, do you have any other suggestions on how to solve this issue? Or do you know a safe way to escape the parameters?
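On the escaping point: if the worry is users smuggling extra wildcards into the pattern (rather than SQL itself, which the parameters already prevent), the query can stay parameterized and declare an escape character instead. A sketch, not a fix for the plan issue:

```sql
-- '\' is declared as the escape character, so a literal '%' or '_' in
-- user input can be passed inside @filter2 as '\%' or '\_' and will
-- match the character itself rather than acting as a wildcard.
SELECT * FROM table
WHERE field2 LIKE @filter2 ESCAPE '\';
```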

EDIT: Here you can see the execution plans for a query with one parameter using LIKE: (execution plan images not shown)

EDIT: A more simplified, representative query execution plan: (image not shown)

David Espart
  • Parameter sniffing does not just apply to stored procedures. It applies to any query where the parameterised execution plan is cached and reused later by other invocations of the query that pass different values for the parameters. When you use literal values the Query optimizer can make certain simplifications to the query that are valid for that particular case but would not be suitable for a plan which needs to be potentially applied to any parameter value (this is different to Parameter sniffing) Can you provide an example execution plan to illustrate the exact problems you are having? – Martin Smith Jan 21 '11 at 14:24
  • @Martin I've added the execution plan. – David Espart Jan 21 '11 at 15:22
  • @despart - Which is the top arrow going into the hash match in the first plan? (I can see in the bottom plan it comes from the `like` I'm just wondering if this is reversed in the good plan) – Martin Smith Jan 21 '11 at 15:34
  • @Martin In both plans, the top arrow is the LIKE – David Espart Jan 21 '11 at 15:45
  • @Martin I've added a more simplified execution plan. – David Espart Jan 21 '11 at 16:03
  • Well in that case as far as is possible to tell from the pictures they look like basically the same plan then (except the good one is a parallel plan). `LIKE` shouldn't affect anything as it gets converted into the same range seek operation by the compute scalar as is presumably happening in the other plan. Are there any big discrepancies in actual vs estimated rows? – Martin Smith Jan 21 '11 at 16:09
  • Actual and estimated plans are the same for each query. – David Espart Jan 21 '11 at 16:15
  • Actual and estimated number of rows not plans! You can see this by mousing over the arrows on the actual plan. – Martin Smith Jan 21 '11 at 16:17
  • lol sorry...In the parameterized plan, the actual is 4 Million and the estimated is 180K whereas in the literal, both are 4 M. – David Espart Jan 21 '11 at 16:22
  • 1
    That would explain why you are getting a serial plan rather than the parallel one then. I doubt that could possibly explain a 42 times difference in performance though. I wonder whether the underestimate also means insufficient memory is granted and you are getting loads of hash warnings? (You can see this in profiler) – Martin Smith Jan 21 '11 at 16:40
  • On your bad execution plan can you click on the leftmost operator (the `SELECT`) and look at the properties window in SSMS -> Parameter List Section and see what the Parameter Compiled Value is? What is the value when you execute it? Was it compiled for a value without a leading wildcard and now you are executing it with a parameter that does have a leading wildcard? – Martin Smith Jan 21 '11 at 19:07

2 Answers


Take a look at the "Estimated number of rows" property in the execution plan. With your slow version (with parameters), SQL Server can't make a good estimate of the rows your query will return, because it doesn't evaluate the actual values of the variables at compile time. It just uses the statistics to estimate the cardinality of the fields you are filtering on, and builds an execution plan accordingly.

My solution to a problem like this one was to create a stored procedure with as many parameters as filters you need:

CREATE PROCEDURE your_sp @filter1 INT, @filter2 VARCHAR(446) AS
 SELECT * FROM table 
 WHERE field1 = @filter1 
 AND field2 LIKE @filter2

sc.CommandText = "your_sp";
sc.CommandType = CommandType.StoredProcedure;

sc.Parameters.Add(new SqlParameter("@filter1", SqlDbType.Int) { Value = 1 });

sc.Parameters.Add(new SqlParameter("@filter2", SqlDbType.VarChar, 446) { Value = "whatever%" });

connection.Open();
SqlDataReader reader = sc.ExecuteReader();
olmed0
  • How would this help improve the cardinality estimates coming out of the parametrised `LIKE` branch of the plan? – Martin Smith Jan 21 '11 at 17:24
  • Check "Compile-Time Expression Evaluation for Cardinality Estimation" in http://msdn.microsoft.com/en-us/library/ms175933(v=SQL.90).aspx I think the answer is there, although the solution I gave was from practical experience rather than theoretical. – olmed0 Jan 21 '11 at 17:37
  • I'm still not really sure if I'm missing what your suggestion is here. Are you saying that if it is kept as a parameterised query but put into a stored procedure this will solve the issue? – Martin Smith Jan 21 '11 at 18:02
  • Exactly. In this way, parameters are evaluated at compile time. Look at what it says there: "Avoid the use of local variables in queries. Instead, use parameters, literals, or expressions in the query." With local variables, the optimizer uses a default estimate when estimating selectivity (such as 30 percent of the total number of rows). – olmed0 Jan 21 '11 at 18:10
  • This will happen with the OP's original query though. It is still a parameterised query. – Martin Smith Jan 21 '11 at 18:52
  • Definitely true, but he said he noticed a big slowdown when running the queries in SSMS and I focused on that. Not sure how C# "translates" and runs the code; this could be seen with the SQL Profiler. I have edited my answer with the proper way of calling a stored procedure in C# according to http://msdn.microsoft.com/en-us/library/yy6y35y8.aspx Let's see if the OP can test it... – olmed0 Jan 21 '11 at 19:33
  • Ah right I hadn't noticed that. Then yes unless they are using `sp_executesql` in SSMS they are probably using local variables in these tests not parameters at all (+1) – Martin Smith Jan 21 '11 at 19:41
  • Sorry I've been unable to respond before... The parameterized queries in C# are executed using sp_executesql, so I think this solution would not apply. Anyway, I'm going to try to convert one of the queries to a stored procedure and see how it performs. – David Espart Jan 28 '11 at 10:49

Sometimes queries requiring filtered indexes can cause problems.

I just had a situation where I had a query something like

orders.Where(x => x.Cancelled == options.isCancelled)

Where options.isCancelled was a dynamic boolean. With EF Core this became parameterized in the SQL query to something like SELECT ... FROM Orders WHERE Cancelled = @param_cancelled. The database can't use a filtered index here because it doesn't know in advance what the value will be.

The solution for me was:

if (options.isCancelled) 
{
   orders = orders.Where(o => o.Cancelled == true);
}
else 
{
   orders = orders.Where(o => o.Cancelled == false);
}

This enabled the query optimizer to use a filtered index I had created (`WHERE IsCancelled = 1`), which greatly improved the performance.
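For reference, a filtered index like the one described would be created along these lines (a sketch; the index name and key column are assumptions, matching the `IsCancelled = 1` predicate above):

```sql
-- Filtered index covering only cancelled orders. The optimizer can only
-- use it when it knows at compile time that the query's predicate is
-- satisfied by IsCancelled = 1 (a literal, not an unknown parameter).
CREATE NONCLUSTERED INDEX IX_Orders_Cancelled
ON Orders (IsCancelled)
WHERE IsCancelled = 1;
```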

TBH something fishy was still going on, because it worked fine in SSMS even without the index but completely timed out from C#. This trick forced two distinct queries, for which SQL Server would need to find two independent query plans. So this at least made me confident enough to proceed, even though I still wasn't 100% sure what had happened.

Simon_Weaver