10

We have run into a problem with a stored procedure (sp).

We have a pretty simple sp containing a declared table variable and a couple of outer joins, and in the end it returns between 20 and 100 rows.

Since querying this sp had been giving us poor performance both in production and in our test environment, we recently rewrote it to be more efficient and tested it thoroughly, with great performance in our test environment.

We released it to production only to find out that it is still very slow and is causing our .NET 2.0 application to time out when it is called.

We couldn't understand it, so we went into Management Studio on the production database and ran the sp there; it executes in under 1 second.

That is, when run from our application it is extremely slow and causes timeouts; when run from Management Studio it is very quick and never takes more than a second.

Does anyone with good knowledge of SQL Server 2005 have a hint for us regarding this?

gbn
Mattias
  • When you say you test the SP in management studio, are you calling the SP using EXEC and providing some parameters or are you just using the query body from inside the SP? – AnthonyWJones Jan 22 '09 at 10:22
  • Did you test in SSMS with the same user the app uses to connect? – devio Jan 22 '09 at 10:29

6 Answers

11

I think your problem might be "parameter sniffing". It is the process by which SQL Server's execution environment "sniffs" the sp's parameter values during compilation or recompilation to generate a faster execution plan. But sometimes it sniffs a combination of parameter values which, together with the data the sp will return, produces a plan that makes the sp really slow.

There are a couple of good explanations out there. Search on Stack Overflow. This one is good: http://omnibuzz-sql.blogspot.com/2006/11/parameter-sniffing-stored-procedures.html

One possible solution is to declare local variables in the sp and assign the incoming parameter values to them, and then use only the local variables in the sp.

CREATE PROCEDURE [dbo].spTest
  @FromDate AS DATETIME
AS
BEGIN
  -- Copy the incoming parameter into a local variable so that the
  -- optimizer cannot sniff the caller's value when building the plan.
  DECLARE @FromDate_local AS DATETIME
  SET @FromDate_local = @FromDate
  ...
  -- Reference only the local variable in the queries.
  SELECT * FROM TestTbl WHERE FromDate >= @FromDate_local
END
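As a follow-up sketch (not part of the original answer): on SQL Server 2005 you can also approach parameter sniffing with query hints instead of local variables, for example OPTIMIZE FOR to compile the plan for a fixed, representative value, or OPTION (RECOMPILE) to recompile the statement on every execution. The table and parameter names below are just the hypothetical ones from the example above.

-- Variant 1: compile the plan for a representative value instead of the sniffed one
SELECT * FROM TestTbl
WHERE FromDate >= @FromDate
OPTION (OPTIMIZE FOR (@FromDate = '20090101'))

-- Variant 2: recompile this statement on every execution
SELECT * FROM TestTbl
WHERE FromDate >= @FromDate
OPTION (RECOMPILE)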
Christian80
  • THANK YOU! I ran into this exact problem and your solution worked beautifully! –  Jul 02 '12 at 15:00
3

Recompile is a blunt instrument. It's most likely parameter sniffing.

See this question: Stored Procedure failing on a specific user
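As an illustrative sketch (the procedure name is assumed, borrowed from the example in the other answer, not from this one), the blunt-instrument options would look roughly like this on SQL Server 2005:

-- Throw away the cached plan so a fresh one is compiled on the next call
EXEC sp_recompile N'dbo.spTest'

-- Or force a recompile on every execution (heavier, rarely needed)
ALTER PROCEDURE dbo.spTest
  @FromDate AS DATETIME
WITH RECOMPILE
AS
BEGIN
  SELECT * FROM TestTbl WHERE FromDate >= @FromDate
END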

gbn
1

Thanks for the replies, guys. It seems that running sp_recompile solved the problem; at least everything has been running smoothly since I executed it yesterday afternoon. I will keep watching it and see if it stays quick.

What I don't understand, however, is why a recompile wasn't triggered when I changed the contents of the sp?
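For anyone checking the same thing, a minimal sketch (using the hypothetical spTest name from the answer above) of how to see whether a cached plan for the procedure exists and how often it has been reused:

-- Look up the cached plan for the procedure and how many times it has been used
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%spTest%'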

Mattias
0

Make sure your production database has up-to-date stats and the indexes are in good condition (if possible consider rebuilding the indexes involved).
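A minimal sketch of what that maintenance could look like on SQL Server 2005; the table name is a placeholder, not from the question.

-- Refresh statistics on a table the sp touches
UPDATE STATISTICS dbo.TestTbl WITH FULLSCAN

-- Or refresh statistics database-wide
EXEC sp_updatestats

-- Rebuild the indexes involved
ALTER INDEX ALL ON dbo.TestTbl REBUILD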

AnthonyWJones
0

Can you be sure that there is not a deadlock situation occurring? The run from Management Studio would be isolated, whereas from the application this might be part of a bigger transaction.
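As an illustrative check (not part of the original answer): while the application call is hanging, you can look for blocked sessions on SQL Server 2005 with something like this.

-- Sessions currently waiting on another session
SELECT session_id, blocking_session_id, wait_type, wait_time, command
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0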

Chris Simpson
-1

I recommend that you change your preferences in SSMS so that you connect with ARITHABORT OFF by default, to avoid this kind of confusion. There is actually a small disadvantage to having the same settings as the application, though: you may not notice that the performance problem is related to parameter sniffing. But if you make it a habit, when investigating a performance issue, to run your problem procedure with ARITHABORT both ON and OFF, you can easily conclude whether parameter sniffing is involved.

Look at these links: https://www.sommarskog.se/query-plan-mysteries.html#sniffinfo

https://learn.microsoft.com/en-us/sql/t-sql/statements/set-arithabort-transact-sql?view=sql-server-ver15
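A minimal sketch of that habit in SSMS, assuming the spTest procedure and @FromDate parameter from the earlier example:

-- Run with the setting most ADO.NET clients use
SET ARITHABORT OFF
EXEC dbo.spTest @FromDate = '2009-01-01'

-- Run with the SSMS default; a large timing difference between the two runs
-- suggests different cached plans, i.e. parameter sniffing
SET ARITHABORT ON
EXEC dbo.spTest @FromDate = '2009-01-01'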