
I was running into intermittent issues with a Stored Procedure whenever it took longer than the default timeout value; I was advised of that here.

So I increased the value, and it seems to have helped.

But it makes me wonder: Why not just make it a very high number always, just in case the query or operation takes a long time? It's not as if it's going to take that long just because that is the MAX allowed, right?

And that being the case (I imagine), why would the default value be so low (30 seconds, I believe)?

UPDATE

I originally set the SqlCommand's CommandTimeout value to 300 (5 minutes), but with that I got "Context Switch Deadlock occurred." So I reduced it to 120 (2 minutes), and that seems to be more or less the "sweet spot" for me. I did get "Timeout expired" in one of several tests, but when I retried the exact same range it completed successfully the second time, so I guess it's just "one of those things": 120 will sometimes not be enough of a timeout, but 300 is apparently too much. In other words, this balancing act between too little and too much doesn't appear to be an exact science.
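For reference, this is roughly the pattern I'm using; the proc name, parameters, and connection handling below are placeholders, not my actual code:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    static class ReportRunner
    {
        public static DataTable RunReport(string connStr, DateTime begin, DateTime end)
        {
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand("dbo.GetRangeTotals", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.CommandTimeout = 120; // seconds; at 300 I saw the deadlock message
                cmd.Parameters.AddWithValue("@BeginDate", begin);
                cmd.Parameters.AddWithValue("@EndDate", end);

                conn.Open();
                var table = new DataTable();
                using (var reader = cmd.ExecuteReader())
                {
                    table.Load(reader);
                }
                return table;
            }
        }
    }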

B. Clay Shannon-B. Crow Raven
  • Plus if there is something wrong and that makes queries run longer than normal and you have increased the timeout, you will get more contention for memory, locking etc. which may make things worse. – Dijkgraaf Nov 24 '15 at 02:44
  • 1
    If the query lock any table, it is quite normal for us to set the timeout to be short(duration that make sense). – Ppp Nov 24 '15 at 05:05

1 Answer


The timeout only limits the maximum time you'll wait; it won't make quick things take longer.

Sometimes there is nothing you can do: a feature is just inherently expensive and will take a ridiculously long time. In that case, a longer timeout is simply necessary.

Hopefully, though, that is not the usual case, because users often don't want to wait very long. If a user gives up and stops waiting, you may be wasting resources computing a result that will never be used. A long timeout can also force the user to wait out an operation they started by accident, when failing fast would have let them recover sooner.

My recommendation is to keep reasonable timeouts in place, and only extend them in the limited scenarios where it is necessary.
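For example, you can keep the 30-second default for everything else and raise the timeout only on the one command that is known to be slow. A minimal sketch (the query and proc names are made up, purely to illustrate the idea):

    using System;
    using System.Data;
    using System.Data.SqlClient;

    static class TimeoutExample
    {
        static void RunQueries(SqlConnection conn)
        {
            using (var quick = new SqlCommand("SELECT COUNT(*) FROM dbo.Orders", conn))
            {
                // CommandTimeout defaults to 30 seconds; a fast query still
                // returns immediately, since the timeout is only an upper bound.
                int count = (int)quick.ExecuteScalar();
                Console.WriteLine(count);
            }

            using (var slow = new SqlCommand("dbo.MonthlyRollup", conn))
            {
                slow.CommandType = CommandType.StoredProcedure;
                slow.CommandTimeout = 120; // extend only where it is actually needed
                slow.ExecuteNonQuery();
            }
        }
    }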

On a completely different topic: one might be able to change the feature itself to run faster, for example by filtering so it works with less data, pre-aggregating intermediate totals that are faster to work with, adding indexes, and/or optimizing the query. Sometimes an index can be the difference between 2 minutes and 2 seconds.

Greg
  • Yes, I realize increasing the Timeout doesn't speed things up; but at a certain point (300 for me), I got "Context Switch Deadlock occurred." Reducing it to 120, I no longer got that. In this utility, the thing that makes the user wait is their sole reason for using the utility; they will wait, they cannot do otherwise. I have very little control over the speed, because a pre-existing Stored Proc returns the data. – B. Clay Shannon-B. Crow Raven Nov 24 '15 at 17:28
  • 2
    Speaking of the ContextSwitchDeadlock, you might be interesting in the following link: http://stackoverflow.com/questions/578357/visual-studio-contextswitchdeadlock – Greg Nov 24 '15 at 17:34
  • 1
    I hope not, I'm trying to be boring as all get out. – B. Clay Shannon-B. Crow Raven Nov 24 '15 at 17:36
  • Thanks for that link; I followed the answer, and expect marvelous results; since it is a "debugging" thing that you are switching off, does that mean that a user (running the release version of the .exe) would never see it anyway (do you know)? – B. Clay Shannon-B. Crow Raven Nov 24 '15 at 18:00