
I have a query that searches a table and returns a row when it matches a specific value.

The search value is a 12-digit number.

The functionality is: if the user provides a number such as 414, it searches like '%414%'.

And if the number provided is longer than 6 digits, it searches like '4146144%'.
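The pattern-building logic described above can be sketched in T-SQL (the variable names are assumed for illustration):

```sql
DECLARE @SearchTerm varchar(12) = '4146144';

-- Short input: contains-search with a leading wildcard (forces a scan).
-- Long input (more than 6 digits): prefix search (can use an index seek).
DECLARE @Pattern varchar(14) =
    CASE WHEN LEN(@SearchTerm) > 6
         THEN @SearchTerm + '%'
         ELSE '%' + @SearchTerm + '%'
    END;
```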

As a result the query is getting very slow, since we have 20 million records.

Can you please suggest an alternative to fix this problem?

The company does not allow us to use any external API or anything else.

 SELECT  distinct TOP (15) bt_id AS [Key]
       , UPPER(rtrim(ltrim(bt_id)))+' - '+UPPER(rtrim(ltrim(bt_desc))) AS [Description]   
 FROM    fcbillto 
 WHERE   bt_id   like '%' + @SearchTerm + '%'
 OR      bt_desc like '%' + @SearchTerm + '%'

If full-text search is the solution, can I create two different catalogs for a single table? The column above is one of the columns I need to search on, and in an else condition I search on another column of the same table, so I would need two catalogs.

Ayush Kalia
  • is bt_desc ntext? do you really need to search for the number in the description, or may it be sufficiently accurate when you search only in the bt_id? – Cee McSharpface Feb 07 '17 at 09:29
  • Both are of data type char: bt_id char(12) and bt_desc char(40). The DB was designed earlier, so I can't change the design. – Ayush Kalia Feb 07 '17 at 09:32
    proper indexing would be one way to speed things up: [this](http://www.sqlservercentral.com/blogs/dwainsql/2014/03/26/getting-an-index-seek-to-speed-up-like-string-searches), or potentially a duplicate of [this](http://stackoverflow.com/questions/1569002/how-can-i-optimize-refactor-a-tsql-like-clause) or [this](http://stackoverflow.com/questions/1388059/sql-server-index-columns-used-in-like) – Cee McSharpface Feb 07 '17 at 10:03
    Possible duplicate of [How can I optimize/refactor a TSQL "LIKE" clause?](http://stackoverflow.com/questions/1569002/how-can-i-optimize-refactor-a-tsql-like-clause) – Cee McSharpface Feb 07 '17 at 10:05
    @dlatikay: no amount of indexing can speed up queries with `LIKE '%414%'` searches (with a leading `%` wildcard) - those are just always going to be full table scan and thus slow ...... – marc_s Feb 07 '17 at 10:17
  • Can you tell me, given that index structure, why it slows down when we have a leading %? And can I use the full-text search option here? The twist is that the same query uses the same LIKE operator on another column in an else statement. So can I create more than one full-text catalog on one table? – Ayush Kalia Feb 07 '17 at 10:43

2 Answers


You have two problems that preclude the efficient use of any indexes as your query stands:

1) The OR clause. SQL Server cannot use an index on either bt_id or bt_desc with the OR clause, because using an index to filter on one column might discard valid rows that meet the predicate on the other column. If you had an index on each column you could use a UNION to get rows that meet either criterion (or, if you're lucky, the optimiser might do this under the hood), but the indexes might not be used as things stand because...

2) A LIKE with a leading % cannot seek on an index, because the index is ordered by the leading characters. If it has to look for characters in the middle of the string then it cannot use the index to seek for rows. (That doesn't mean it won't scan the index, which is smaller than the table itself, so there may still be some performance improvement on this basis.)
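As a sketch (the index names are my own), narrow nonclustered indexes covering both output columns would at least let SQL Server scan the smaller index instead of the whole table:

```sql
-- Assumed names; each index covers both output columns, so the query
-- can be answered from an index scan rather than a full table scan.
CREATE NONCLUSTERED INDEX IX_fcbillto_bt_id
    ON fcbillto (bt_id) INCLUDE (bt_desc);

CREATE NONCLUSTERED INDEX IX_fcbillto_bt_desc
    ON fcbillto (bt_desc) INCLUDE (bt_id);
```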

So without a change to requirements it may prove difficult to see real performance improvements. If you can get the requirements changed so that the search is on leading characters only (or the query is limited on another indexed column), then this will be quicker (assuming indexes on bt_id and bt_desc):

SELECT TOP (15) bt_id AS [Key]
     , UPPER(RTRIM(LTRIM(bt_id))) + ' - ' + UPPER(RTRIM(LTRIM(bt_desc))) AS [Description]
FROM
(
    SELECT bt_id, bt_desc
    FROM   fcbillto
    WHERE  bt_id LIKE @SearchTerm + '%'

    UNION

    SELECT bt_id, bt_desc
    FROM   fcbillto
    WHERE  bt_desc LIKE @SearchTerm + '%'
) search
ORDER BY bt_id -- I presume?

Full text indexes are designed to search for words within column(s) - you just have characters so that won't help.
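For context, SQL Server's full-text predicate CONTAINS matches whole words or word prefixes, not arbitrary substrings, which is why it doesn't fit here. A sketch, assuming a full-text index existed on bt_desc:

```sql
-- Matches values containing a word that starts with 414,
-- but NOT values where 414 appears mid-word, unlike LIKE '%414%'.
SELECT bt_id, bt_desc
FROM   fcbillto
WHERE  CONTAINS(bt_desc, '"414*"');
```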

strickt01
  • I believe that this union solution is slower than the query from the question, because the full union has to be evaluated before the outer query eliminates the rows beyond the first 15 – t-clausen.dk Feb 07 '17 at 13:38
  • It depends on the need for an `ORDER BY`. I added one assuming it would be relevant. If one is required then the original query is going to have to scan the whole table anyway in order to ascertain which rows make up the first 15. If not, then the original query may be faster, depending on where the rows meeting the criteria lie. However it's going to involve a clustered index scan, which has horrendous implications for locking. – strickt01 Feb 07 '17 at 13:55

Try this one.

SELECT distinct TOP (15) 
  bt_id AS [Key], 
  UPPER(rtrim(ltrim(bt_id)))+' - '+UPPER(rtrim(ltrim(bt_desc))) AS [Description] 
FROM fcbillto 
WHERE charindex(@SearchTerm, bt_id) > 0 OR charindex(@SearchTerm, bt_desc) > 0 
  • Thanks Saravanan. Can you please tell me the name of this feature? Is it LIKE, or is it something else? Then I can learn its pros and cons before implementation and suggest it to my seniors. – Ayush Kalia Feb 07 '17 at 12:02
  • However: charindex(@SearchTerm, bt_id) > 0 does the same as bt_id like '%' + @SearchTerm + '%', but internally they work differently somehow. For some people the LIKE expression turned out faster, for others the CHARINDEX method. Maybe it depends on the query, number of processors, operating system, whatever. It may be worth a try. Please show this to your seniors. http://cc.davelozinski.com/sql/like-vs-substring-vs-leftright-vs-charindex – Saravanan Yadav Feb 07 '17 at 12:19
  • It's an interesting link but it applies to searches over a single column. It also doesn't look to use an index (it looks at both ends of the string) so is purely testing the functions - not the use of an index to increase performance. The current problem lies with the query causing a table/clustered index scan over a 20 million row table so unless some sort of index usage can be applied there are going to be some very negative performance implications (not least locking!). – strickt01 Feb 07 '17 at 13:04