
I've had a project dumped in my lap, and I'm pretty far out of my element. There is a table with roughly 10,000,000 rows that is being queried. One query in particular is taking too long to run and is timing out. The query generated by Linq is:

-- Region Parameters
DECLARE @p0 Int = 335
DECLARE @p1 DateTime = DATEADD(day,-7,GETDATE())
-- EndRegion
SELECT *
FROM [Events] AS [t0], 
    [Devices] AS [t1], 
    [EventTypes] AS [t2], 
    [DeviceTypes] AS [t3]
WHERE ([t1].[DeviceID] = [t0].[DeviceID]) 
    AND ([t2].[EventTypeID] = [t0].[EventTypeID]) 
    AND ([t3].[DeviceTypeID] = [t1].[DeviceTypeID]) 
    AND ([t1].[RoomID] = @p0) 
    AND ([t0].[EventTime] >= @p1)

If I change the last line to be [t0].[EventTime] >= DATEADD(day,-7,GETDATE()) instead of using the variable @p1, the query goes from taking 45 seconds to run, to taking 3 seconds to run.

My two questions are: 1) why does this simple change increase performance so drastically, and 2) is there a way to force Linq to not use the variable and inline the datetime object instead?

The Linq code is C# on .NET 3.5 and the server is MSSQL 2005.

Edit: I added the SELECT * for brevity; the actual query does not do that.

kyork

3 Answers

  • DATEADD(day,-7,GETDATE()) can be optimised because it's constant
  • @p1 is a parameterisation to improve plan re-use

It's quite common to see literal vs. parameter differences like this.
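
If you want to test that by hand, one thing to try (my suggestion, not something from the original question) is to keep the variable but add OPTION (RECOMPILE), which tells the optimiser to compile the statement for the value @p1 holds right now instead of building a reusable generic plan:

DECLARE @p0 Int
DECLARE @p1 DateTime
SET @p0 = 335
SET @p1 = DATEADD(day,-7,GETDATE())

SELECT *
FROM [Events] AS [t0], 
    [Devices] AS [t1], 
    [EventTypes] AS [t2], 
    [DeviceTypes] AS [t3]
WHERE ([t1].[DeviceID] = [t0].[DeviceID]) 
    AND ([t2].[EventTypeID] = [t0].[EventTypeID]) 
    AND ([t3].[DeviceTypeID] = [t1].[DeviceTypeID]) 
    AND ([t1].[RoomID] = @p0) 
    AND ([t0].[EventTime] >= @p1)
OPTION (RECOMPILE) -- plan is built for the current @p1 value, at the cost of a compile on every execution

As far as I know Linq to SQL gives you no direct way to append a query hint like this, so it's mainly useful for confirming the diagnosis by hand.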

Another possibility is that [t0].[EventTime] is smalldatetime and you have data type precedence forcing a conversion to datetime to match @p1. With direct use of DATEADD, the optimiser will change the DATEADD expression to smalldatetime.
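
To rule that in or out, check the column's declared type in the catalog (a sketch; I'm assuming the table lives in the dbo schema and reusing the names from the question):

SELECT c.name AS column_name, t.name AS type_name
FROM sys.columns AS c
JOIN sys.types AS t ON t.user_type_id = c.user_type_id
WHERE c.[object_id] = OBJECT_ID('dbo.Events')
    AND c.name = 'EventTime'

-- If EventTime comes back as smalldatetime, an explicit cast keeps the
-- predicate in the column's own type instead of widening it to datetime:
--     [t0].[EventTime] >= CAST(DATEADD(day,-7,GETDATE()) AS smalldatetime)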

gbn
  • In the context of running the statement, isn't `@p1` still effectively a constant? SQL Server knows the value is not going to change. Same with the data type difference, wouldn't SQL Server be smart enough to do the conversion only once since `@p1` is not going to change throughout execution of the query? – Samuel Neff Jan 18 '11 at 17:50
  • @Samuel Neff: no. For this execution DATEADD(day,-7,GETDATE()) is evaluated and assigned to @p1. A later execution will mean a different value in @p1. The plan will be re-used with a plan to deal with different values. When used direct, the plan will be optimised for the value at that time. A subsequent execution means a plan recompilation. See [MSDN](http://msdn.microsoft.com/en-us/library/ms175580.aspx) or [here](http://beyondrelational.com/forums/p/4988/10353.aspx) or [here](http://stackoverflow.com/questions/510214/) – gbn Jan 18 '11 at 18:00
  • @gbn, thanks for the links. What i didn't get from your original answer is that the constant can provide a better query plan (from your second link). – Samuel Neff Jan 18 '11 at 18:14
  • Thanks for the information. I think I understand why it is slower now. Is there any way for me to tweak the Linq statement so that it will use the constant? – kyork Jan 18 '11 at 19:07
  • @kyork: sorry, don't know. Check datatypes though in case it is a conversion issue – gbn Jan 18 '11 at 19:09

> If I change the last line to be [t0].[EventTime] >= DATEADD(day,-7,GETDATE()) instead of using the variable @p1, the query goes from taking 45 seconds to run, to taking 3 seconds to run.

I am rather suspicious of that statement. Are you sure you flushed the buffers after the swap? Did you try swapping back, and does it still perform as badly when using the @p1 parameter?

General notes on performance troubleshooting.

  1. Run both statements with "set statistics io on" and "set statistics time on"
  2. Hit Ctrl-M or use the toolbar button to include the Actual execution plan
  3. Always flush buffers and clear the plan cache between runs using "dbcc freeproccache" and "dbcc dropcleanbuffers"
  4. Line up multiple versions and run them together to see the relative COST from the execution plans
  5. If required when seeking advice, dump the text versions of the plans with "set showplan_text on" and run the query in results-to-Text mode

The execution plan gives you a general direction of where it is going, but it is not always accurate. Balance it with the IO and CPU statistics.
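
A rough harness for the steps above might look like this (a sketch only: it trims the query down to one table for brevity, assumes a development server where you are free to clear caches, and reuses the Events/EventTime names from the question):

-- step 3: start from a cold cache (development server only!)
dbcc freeproccache      -- clear the plan cache
dbcc dropcleanbuffers   -- drop clean data pages from the buffer pool

-- step 1: collect IO and timing statistics for each run
set statistics io on
set statistics time on

-- steps 2 and 4: run the candidate versions back to back with the
-- Actual execution plan (Ctrl-M) enabled, then compare relative cost,
-- logical reads and CPU time

-- version A: parameterised, the shape Linq generates
DECLARE @p1 DateTime
SET @p1 = DATEADD(day,-7,GETDATE())
SELECT COUNT(*) FROM [Events] AS [t0] WHERE [t0].[EventTime] >= @p1

-- version B: the literal expression
SELECT COUNT(*) FROM [Events] AS [t0] WHERE [t0].[EventTime] >= DATEADD(day,-7,GETDATE())

set statistics io off
set statistics time off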

RichardTheKiwi

10.0.1600 is SQL Server 2008 RTM. Try upgrading to SP2 (which is a good idea anyway!).

Often when I run into weird optimization issues I've found that the SQL Server is out of date and a newer SP resolves the issues. Not always, but often.

Samuel Neff