I have a dynamic query which might programmatically generate more than a thousand WHERE conditions, each of them like (a = '1' and b = '2') with small variations. Will so many conditions in one SQL query cause a performance issue? Column a is varchar(max), with no index created. The table it queries against is large and partitioned by another column c. Thanks very much. This is SQL Server 2012.
3 Answers
Yes. A thousand conditions in the WHERE clause can impact performance. If you can, I would recommend creating a temporary table with the values and using a join for the logic. You should create an index on the temporary table.
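A minimal sketch of that approach (the table t and columns a and b come from the question; the #pairs table, its index name, and the varchar(100) sizes are assumptions for illustration):

create table #pairs (a varchar(100) not null, b varchar(100) not null);
-- generate one row per (a, b) condition instead of one OR'd predicate per condition
insert into #pairs (a, b) values ('1', '2'), ('3', '4');
create index idx_pairs_ab on #pairs (a, b);
select t.*
from t
join #pairs p on t.a = p.a and t.b = p.b;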
Note: you could use a trick on the main table, by adding a computed column that concatenates the column values together, then use IN:
alter table t add ab as (a + ':' + b);
-- note: if a is varchar(max), as in the question, cast it to a shorter type in the
-- computed column first; LOB types cannot be used as index key columns
create index idx_t_ab on t(ab);
Then modify the query to be:
where ab in ('1:2', . . . )
This would not work well for all applications, but it might fit your needs for performance.

Thanks for the input. It would be good if we could have a quantified performance indication from an authoritative source; otherwise it might be hard to convince people – Jia Ji Chen Mar 11 '15 at 23:55
This will be inefficient, but there are also limits on query size and number of parameters that could cause errors in your application if you build queries this way (for example, SQL Server caps a batch at 65,536 times the network packet size and a parameterized call at 2,100 parameters). These limits are not always clearly surfaced, depending on a number of factors, so building queries this way can lead to subtle errors that are hard to eliminate.
See this question for a discussion.
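If the values must be passed from the application as parameters, one common way around the 2,100-parameter cap is a table-valued parameter. This is a sketch rather than part of the original answer; t, a, and b come from the question, while dbo.PairList, dbo.FilterByPairs, and the varchar(100) sizes are hypothetical:

-- a table type holding one row per (a, b) condition
create type dbo.PairList as table (a varchar(100) not null, b varchar(100) not null);
go
-- the application fills @pairs and calls the procedure, avoiding a giant WHERE clause
create procedure dbo.FilterByPairs @pairs dbo.PairList readonly
as
    select t.*
    from t
    join @pairs p on t.a = p.a and t.b = p.b;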

Thanks for the reply; this is for SQL 2012. I could not find proof, but I was told 2012 has improved the limit. Is there any quantified measurement that can be used to show customers the impact? – Jia Ji Chen Mar 11 '15 at 08:14
Try combining WHERE conditions in your dynamic query. When the (a, b) value pairs are sorted by a, you can generate a query like:
where a='1' and b in ('2', '5', '7', '8')
or a='2' and b in ('1', '5', '9')
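If hand-grouping the pairs is awkward to generate, the same filter can also be written as a join against a VALUES row constructor (available since SQL Server 2008). This is an alternative sketch, not part of the original answer; the pair values shown are examples:

select t.*
from t
join (values ('1', '2'), ('1', '5'), ('2', '1')) p(a, b)  -- one row per condition
    on t.a = p.a and t.b = p.b;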
