
Hoho there.

I was trying to improve the performance of my application (.NET 3.5, C#) by using stored procedures.

So I wrote a little test app to see how much faster they are compared to normal queries. It looks like this:

    private static long _StartStored;
    private static long _EndStored;
    private static long _StartNormal;
    private static long _EndNormal;
    private static TimeSpan _StoredTime;
    private static TimeSpan _NormalTime;
    private static string[] _Stored = new string[102000];
    private static string[] _Normal = new string[102000];
    private static SqlConnection _SQLConnection;
    static void Main(string[] args)
    {
        Console.WriteLine("Querying 2000 normal queries");
        _SQLConnection = new SqlConnection(/*my_connection*/);
        _SQLConnection.Open();
        _StartNormal = DateTime.Now.Ticks;
        for (int i = 100000; i < 102000; i++) // 2000 queries; <= would overrun the array
        {
            DataTable _ResultDataTable = new DataTable();
            SqlDataAdapter _SQLAdapter = new SqlDataAdapter(/*my_query*/, _SQLConnection);
            _SQLAdapter.Fill(_ResultDataTable);
            if (_ResultDataTable.Rows.Count > 0)
                _Normal[i] = _ResultDataTable.Rows[0]["row"].ToString();
        }
        _EndNormal = DateTime.Now.Ticks;
        _NormalTime = TimeSpan.FromTicks(_EndNormal - _StartNormal);
        Console.WriteLine("Total execution time: " + _NormalTime.ToString());
        //-----------------------------------------------------------------------------
        Console.WriteLine("Querying 2000 stored procedures");
        _StartStored = DateTime.Now.Ticks;
        SqlCommand _Cmd = new SqlCommand(/*my_sp*/, _SQLConnection);
        _Cmd.CommandType = CommandType.StoredProcedure;
        SqlParameter _Param = new SqlParameter("@param1", 0);
        _Cmd.Parameters.Add(_Param);
        for (int i = 100000; i < 102000; i++) // 2000 calls; <= would overrun the array
        {
            _Cmd.Parameters["@param1"].Value = i;
            SqlDataReader _Reader = _Cmd.ExecuteReader();
            while (_Reader.Read())
            {
                _Stored[i] = _Reader["StartWork"].ToString();
            }
            _Reader.Close();
        }
        _EndStored = DateTime.Now.Ticks;
        _StoredTime = TimeSpan.FromTicks(_EndStored - _StartStored);
        Console.WriteLine("Total execution time: " + _StoredTime.ToString());
        _SQLConnection.Close();
    }

I'd love to shorten that code, but so far I haven't managed to :D

TL;DR: 2000 stored procedure calls of the same query are only about 4 seconds faster than 2000 ad-hoc queries, which seems quite low to me. 

Am I using the stored procedures wrong?

F.P
  • that depends on how long the whole process takes. if the first took 5s and the second 1s, then i'd say it's quite an improvement! – darasd Nov 25 '09 at 13:30
  • 1
    I realize this is test code, but `SqlDataReader`, `SqlConnection` and `SqlCommand` all implement `IDisposable`, and should be disposed of at some point. – Fredrik Mörk Nov 25 '09 at 13:30
  • Well the test process took about 30 seconds, but the final application will do around 10.000 of such queries – F.P Nov 25 '09 at 13:32
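As Fredrik Mörk's comment points out, `SqlConnection`, `SqlCommand` and `SqlDataReader` all implement `IDisposable`. A tidied-up sketch of the first benchmark loop, wrapping the disposables in `using` blocks and timing with `Stopwatch` instead of `DateTime.Now.Ticks`, might look like this (the connection string and query are still just the placeholders from the original code):

    // Sketch only: same placeholders as the original code.
    // Stopwatch (System.Diagnostics) is more precise than DateTime.Now.Ticks.
    Stopwatch timer = Stopwatch.StartNew();
    using (SqlConnection conn = new SqlConnection(/*my_connection*/))
    {
        conn.Open();
        for (int i = 100000; i < 102000; i++)
        {
            using (SqlDataAdapter adapter = new SqlDataAdapter(/*my_query*/, conn))
            {
                DataTable result = new DataTable();
                adapter.Fill(result);
                if (result.Rows.Count > 0)
                    _Normal[i] = result.Rows[0]["row"].ToString();
            }
        }
    }
    timer.Stop();
    Console.WriteLine("Total execution time: " + timer.Elapsed);

With the `using` blocks the objects are disposed even if `Fill` throws, and the timing no longer depends on the system clock's resolution.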

6 Answers


Most of the overhead will be setting up the stored procedure call and retrieving the result -- which in your example is done 2,000 times.

You may want to consider moving the loop into the stored procedure, then calling the stored procedure once and getting all the results in one go.
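For illustration only (the procedure name, parameters and column names here are made up, not part of the question): if the procedure were rewritten to take the whole ID range and return one row per ID, the client side would shrink to a single round trip:

    // Hypothetical: assumes a set-based procedure "my_sp_range" taking
    // @from/@to and returning (id, StartWork) rows. One call, one result set.
    using (SqlCommand cmd = new SqlCommand("my_sp_range", _SQLConnection))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@from", 100000);
        cmd.Parameters.AddWithValue("@to", 101999);
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                _Stored[(int)reader["id"]] = reader["StartWork"].ToString();
        }
    }

That replaces 2,000 network round trips with one, which is where nearly all the time is going.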

Jeremy McGee
  • Specifically, the network latency will be the bottleneck, and there is little you can do about it other than only do it once. – cjk Nov 25 '09 at 13:32
  • @ck - I agree completely that network latency is the problem here, but I found in our network that establishing a connection with the server alias (svrname) is significantly slower than the fully qualified name (svrname.net.work). The effect is huge on a loop with thousands of connections. So, there are some things you could try. :) – Mayo Nov 25 '09 at 13:38

I agree with Jeremy. One of the better reasons for using a stored procedure is to avoid calling the database an unpredictable number of times from inside a loop. You should move the loop into the stored procedure itself and call it once; as it stands, your example shows a poor use of stored procedures. It's a bit like racing a tractor and a Ferrari across a ploughed field and then concluding that the Ferrari is slow.

darasd
  • Okay, good point. I thought so too, but in that case, wouldn't it even be easier to create a "View" and query that? (Using MSSQL-Server) – F.P Nov 25 '09 at 13:44
  • quite possibly. it really depends on the complexity of the logic, query plans etc. – darasd Nov 25 '09 at 14:02
  • This wasn't the first answer, but I did like the Ferrari/Tractor analogy. :) – Mayo Nov 25 '09 at 14:09

With newer versions of SQL Server, the difference between stored procedures and normal queries is very, very small: SQL Server optimizes the execution of any query, especially if the command text doesn't change between calls.

ema

One BIG trip to the database is almost always faster than the same work done as many small trips in a loop.

You would greatly improve performance by doing the loop in the database itself.

Adriaan Stander

You might find this Q&A useful: Are Stored Procedures more efficient, in general, than inline statements on modern RDBMS's?

gkrogers

You should always use parameterized queries anyway, and once the ad-hoc query is parameterized you might not see any speed difference at all: the server can cache and reuse the plan just as it does for a stored procedure.
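A sketch of what that would look like for the ad-hoc side of the benchmark (the SQL text and table/column names are made-up stand-ins for whatever `/*my_query*/` really is): build the `SqlCommand` once with a typed parameter, then only change the parameter value inside the loop.

    // Hypothetical query text; the point is the reused parameterized command.
    using (SqlCommand cmd = new SqlCommand(
        "SELECT StartWork FROM Work WHERE id = @id", _SQLConnection))
    {
        cmd.Parameters.Add("@id", SqlDbType.Int);
        for (int i = 100000; i < 102000; i++)
        {
            cmd.Parameters["@id"].Value = i;
            object result = cmd.ExecuteScalar();  // first column of first row
            if (result != null)
                _Normal[i] = result.ToString();
        }
    }

Because only the parameter value changes, every iteration hits the same cached execution plan, exactly like the stored-procedure loop.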

Jonas Elfström