
I have a query with a where clause built from a number of local variables, and it is painfully slow. Below is a rough example, as I don't have access to the query currently:

declare @a varchar(50), @b varchar(50), @c varchar(50)
set @a = '%'
set @b = 'foo'
set @c = '%bar'

My where clause is something like

where a = @a and b = @b and c = @c

This takes approx 1 minute to run. However, if I directly reference the values in the where clause, such as:

where a = '%' and b = 'foo' and c = '%bar'

It takes approx 5 seconds.

So my question is, is there a better way to construct my where clause? One important thing to note: there are about 10 local variables used in the where clause, but most are set to a default of %.

Thanks in advance

hoakey

4 Answers

6

Using local variables in a WHERE clause can cause a full table scan, because SQL Server does not know the values of local variables at compile time. It therefore creates an execution plan for the widest range of values the column could hold.

To prevent the performance problem, SQL Server must know the values of the variables at compile time. Defining a stored procedure and passing these local variables in as parameters is one solution. Another solution is to use sp_executesql and, again, pass those local variables as parameters.
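For example, a minimal sp_executesql sketch of the idea (the table name MyTable and the LIKE comparisons are assumptions for illustration, not from the question):

declare @sql nvarchar(200)
set @sql = N'select * from MyTable where a like @a and b like @b and c like @c'

-- the inner @a/@b/@c are real parameters, so SQL Server can sniff
-- their values when it compiles the inner statement
exec sp_executesql @sql,
    N'@a varchar(50), @b varchar(50), @c varchar(50)',
    @a = @a, @b = @b, @c = @c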

Or you can add OPTION (RECOMPILE) at the end of your SQL statement so the plan is compiled with the current values of your local variables. This will solve the performance problem.
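Roughly like this, applied to the query shape in the question (again, MyTable is assumed):

select *
from MyTable
where a like @a and b like @b and c like @c
option (recompile) -- plan is recompiled with the current variable values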

mehmetozakan
4

No.

For constants, the optimizer can work out the best plan based on statistics for the given values.

When you use variables you are forcing parameterization, and the plan will be designed to be reusable across a wide range of values.

In this case you can try OPTIMIZE FOR UNKNOWN, which may give better results. Or don't filter like this: use different queries for different permutations. Do you have LIKE too, with leading wildcards?
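Something along these lines if you want to try the hint (table name assumed for illustration):

select *
from MyTable
where a like @a and b like @b and c like @c
option (optimize for unknown) -- plan built from average statistics, not sniffed values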

John Saunders
gbn
  • What other methods would you suggest as an alternative to filtering this way? Yes the wildcard is preceded with LIKE. – hoakey Mar 08 '11 at 18:32
  • @hoakey: for 10 values with LIKE? Not really. Indexes won't be used with LIKE '%...' searches so, sorry to say, any other solution will be equally poor... – gbn Mar 08 '11 at 18:34
  • The local variables get their values from a table which is derived by parsing a CSV parameter that gets passed in to the stored procedure: @FilterList = 'a,b,c,d,f,g'. I then assign each of these to the relevant local variable. Is there a better way to do this that would avoid the aforementioned issues? – hoakey Mar 08 '11 at 18:49
0

It gets messy and can lead to a new set of problems, but you might want to evaluate converting this over to dynamic SQL. Basically, you construct your WHERE clause at run time based on which parameters actually have (non-wildcard) values in them, as sketched below.
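A rough sketch of that approach (MyTable, the column names, and the '%' defaults are assumptions): only the variables that actually restrict the result make it into the WHERE clause, and the values stay parameterized via sp_executesql.

declare @sql nvarchar(max)
set @sql = N'select * from MyTable where 1 = 1'

-- append a predicate only when the variable is more selective than '%'
if @b <> '%' set @sql = @sql + N' and b like @b'
if @c <> '%' set @sql = @sql + N' and c like @c'

-- unused parameters are simply ignored by the inner statement
exec sp_executesql @sql,
    N'@b varchar(50), @c varchar(50)',
    @b = @b, @c = @c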

This is one of the best dynamic SQL write-ups I have found: http://www.sommarskog.se/dynamic_sql.html

And here is another where he specifically addresses dynamic where clauses: http://www.sommarskog.se/dyn-search-2005.html

Rozwel
0

OPTIMIZE FOR UNKNOWN, suggested by gbn, may work in some cases, but in others OPTION (RECOMPILE) may be a better choice.

For simple queries on a dataset that varies between extremes, OPTION (RECOMPILE) gives you the best plan for each case, because it effectively plans each query using the actual values (you pay the plan-compilation cost on every execution). OPTIMIZE FOR UNKNOWN, by contrast, produces a generic plan that may not be the best in any single case, only average across all cases (but it is cached).

Sample usage

select top 10 * from master..spt_values
OPTION (RECOMPILE)

There were some edge-case bugs with OPTION (RECOMPILE) in SQL Server 2005, but it works fine in 2008.

RichardTheKiwi