
I am using SQL Server 2014 Enterprise on Windows Server 2012.

I have checked that the min and max server memory settings for SQL Server are configured properly.
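
For reference, these settings can be read back with a query like the following:

SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN ('min server memory (MB)', 'max server memory (MB)');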

Yesterday, users ran a few reports in parallel multiple times, and I received a lot of alerts (high CPU and high memory). Using the query below, I found the top CPU-consuming queries:

SELECT TOP 10
    qs.total_worker_time / qs.execution_count AS avg_cpu_time,  -- representative column list
    qs.execution_count,
    qs.last_execution_time,
    st.text AS query_text,
    qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.total_worker_time / qs.execution_count DESC, qs.last_execution_time DESC;

I found that the top 10 queries are exactly the same except for a few changes in the WHERE clause, and execution_count = 1 for all of them. The queries insert records into a #temp table in the following format:

INSERT INTO #Temp (col1, col2, ............)
SELECT
.
.
FROM
(
    SELECT
    .
    .
    WHERE (CONTAINS(val1, ' "xx1*" ') OR CONTAINS(val2, ' "yy1*" ') OR CONTAINS(val3, ' "zz1*" ') .......... 30-35 CONTAINS blocks)
    -- here the "xx*", "yy*" values change in each query (I guess this is due to parameter selection by end users)
) x

If these dynamic queries are the culprit, what would be the solution (since users are not going to stop running the reports)?

Should I ask for the report queries to be converted to stored procedures? How can I avoid the different execution plans that I am getting?

I manually killed the sessions running these expensive queries, but that is not a real solution.

Looking for help.

Update:

I no longer see any memory issue, but CPU still reaches 100%.

I observed that this particular query (used in a report) is run frequently by users, and CPU utilization reaches 100% whenever it runs. It happens regularly. The WHERE clause changes on each execution based on the report parameters selected by the user (I don't know exactly how the report is designed or how the parameters are passed).

Additionally, I have checked the table statistics (they are updated every day), and the indexes are correctly defined. The base table holds more than 3 million rows (the val1, val2, ... columns used in CONTAINS belong to this table).
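
For reference, the statistics dates can be checked along these lines (dbo.BaseTable stands in for the actual base table):

SELECT s.name AS stats_name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('dbo.BaseTable');  -- placeholder table name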

What are the ways to tune this query?

p2k
  • `How can I avoid the different execution plans that I am getting?` Can you paste a few of them? – TheGameiswar Oct 27 '16 at 08:01
  • Might be that your memory is stretched to the max and there is (virtual) memory swapping involved. Your queries will not be able to benefit from cache hits, which would slow things down considerably. – TT. Oct 27 '16 at 08:47
  • You have the proper Full-Text Indexes defined on that `val` column? – TT. Oct 27 '16 at 08:48
  • @TT - I don't know much about the purpose of the query. How can I check that? And also, how can I check whether `memory is stretched to the max and there is (virtual) memory swapping involved`? – p2k Oct 27 '16 at 08:52
  • @p2k just curious if you have had any luck optimizing your query. Did you manage to speed it up? – andrews Jan 18 '17 at 20:54

2 Answers


Apparently your SQL statements are not cached, and therefore you get a new execution plan each time. Looking at your code (unless there is some other chunk you have omitted), it is probably because of the CONTAINS clauses in the WHERE clause.
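
You can confirm this against the plan cache with something like the following (the LIKE filter is only illustrative); if usecounts stays at 1 for each variant, every run is compiling a fresh ad hoc plan:

SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%CONTAINS%';  -- adjust to match your statement text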

My suggestion would be to convert these statements to stored procedures AND to remove CONTAINS from the WHERE clause, doing the filtering instead via an INNER JOIN to CONTAINSTABLE(...). See if you manage to get the query cached after these changes.
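
As a rough sketch of what that rewrite could look like (the table, key column, and search term below are placeholders, not your actual schema):

INSERT INTO #Temp (col1, col2)
SELECT t.col1, t.col2
FROM dbo.BaseTable AS t
INNER JOIN CONTAINSTABLE(dbo.BaseTable, val1, '"xx1*"') AS ft
    ON ft.[KEY] = t.KeyCol;  -- [KEY] maps to the table's full-text key index
-- further OR-ed conditions can be expressed as additional CONTAINSTABLE calls
-- whose [KEY] sets are combined with UNION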

As for the high CPU/RAM usage: you said users were running reports in parallel, so this may be a locking issue. Try adding the WITH (NOLOCK) hint to your SELECT statements and see how it goes.
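
For illustration, the hint goes on each table reference in the FROM clause (placeholder names again; see the comments below for the caveats about dirty reads):

SELECT t.col1, t.col2
FROM dbo.BaseTable AS t WITH (NOLOCK)
WHERE CONTAINS(t.val1, ' "xx1*" ');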

You also didn't say how much data your report statement operates on. Are you selecting millions of records? Do you get row multiplication because of various one-to-many or similar joins? Please comment.

UPDATE:

After the OP said that the query works with tables holding 3M+ records and that there are about 30-35 full-text search conditions, the main suggestions would be to:

  1. Redesign the query/DB structure to lower the SQL complexity and the number of CONTAINS clauses.
  2. Convert this query from a user query executed in real time into a nightly report that creates a smaller, pre-computed result set shown to users (see the sketch after this list).
  3. Consider using an external search engine for this task, such as Solr or Elasticsearch. This will require getting familiar with how they work, and you will not be able to join their search results directly to other tables at the SQL level; you will have to call those search engines from code (C#, Java, or whatever you use).
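
A rough sketch of option 2, assuming a nightly SQL Server Agent job and hypothetical object names (dbo.ReportBaseCache, dbo.BaseTable, IsActive):

-- Nightly job: materialize the subset the reports actually query
TRUNCATE TABLE dbo.ReportBaseCache;

INSERT INTO dbo.ReportBaseCache (KeyCol, val1, val2, val3)
SELECT t.KeyCol, t.val1, t.val2, t.val3
FROM dbo.BaseTable AS t
WHERE t.IsActive = 1;  -- whatever filter shrinks the 3M+ rows

-- The report then searches dbo.ReportBaseCache instead of the base table
-- (it needs its own full-text index on val1, val2, val3 for CONTAINS to work).
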
andrews
  • Ew `WITH NOLOCK` can result in dirty reads, missing rows, reading rows twice etc... Not something I would advise for reporting at all... Cf this: [Bad habits : Putting NOLOCK everywhere](https://blogs.sentryone.com/aaronbertrand/bad-habits-nolock-everywhere/). – TT. Oct 27 '16 at 08:37
  • I see there are 30-35 CONTAINS blocks like `CONTAINS (val1, ' "xx*" ') OR CONTAINS (val2, ' "xx1*" ') OR........`. The query uses tables which contain more than 3 million records. – p2k Oct 27 '16 at 09:01
  • @TT. we're not talking about putting it everywhere; check the answer text, I suggested to TRY it and see if there is an improvement. Don't take the bad-habits tips blindly; it's the opposite side of the bad coding medal. For some reason NOLOCK was put there by the SQL Server developers, right? – andrews Oct 27 '16 at 09:01
  • @p2k if you have a query as complex as that, you are probably hitting the maximum of what SQL Server Full-Text Search can do for you. We had a similar issue; please see my question and the linked article here: http://stackoverflow.com/questions/39661792/are-there-any-sql-server-full-text-search-fts-performance-improvements-since-v You may want to convert to an external search engine (Solr, Elasticsearch) OR redesign your query to lower the number of search conditions. – andrews Oct 27 '16 at 09:04
  • I have never ever needed a NOLOCK and I will probably never need it. There are only very rare situations where a NOLOCK makes any sense. Giving a hint to try out NOLOCK without the necessary disclaimers may put the OP on a dangerous path. Hence the disclaimer comment ;) – TT. Oct 27 '16 at 09:14
  • @TT. alright, let's drop the NOLOCK ;). I said this before the OP replied about how much data he had and that there were up to 35 search conditions. Of course, having to traverse the FTS index over 3+ million records simultaneously for several users with so many search conditions is likely to cause high CPU and RAM usage. The OP needs to convert the query and/or DB structure and try to use pre-computed, nightly reports for this kind of task. – andrews Oct 27 '16 at 09:20

I would like to suggest enabling the server-level configuration option "optimize for ad hoc workloads". It is an advanced option, so "show advanced options" has to be turned on first:

sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'optimize for ad hoc workloads', 1;
GO
RECONFIGURE;
GO
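
You can verify that the change took effect (value_in_use should become 1):

SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'optimize for ad hoc workloads';
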
Vinay Kulkarni