
An Oracle stored procedure runs in the database and returns some rows of data, taking 30 seconds. However, calling this procedure and filling a DataAdapter to then populate a DataSet takes 1m40s in a C# .NET application.

While testing, I noticed that using a DataReader and reading sequentially with the Read() method after calling the stored procedure also takes approximately 1m40s in total.

Any idea what could be causing these bottlenecks and how to get rid of them? Thanks in advance!

Edit: Added the code

        OracleConnection oracleConnection = Connect();
        OracleCommand oracleCommand = CreateCommand(2);
        OracleDataReader oracleDataReader = null;

        if (oracleConnection != null)
        {
            try
            {
                if (oracleConnection.State == ConnectionState.Open)
                {
                    oracleCommand.Connection = oracleConnection;
                    oracleDataReader = oracleCommand.ExecuteReader();
                    DateTime dtstart = DateTime.Now;
                    if (oracleDataReader.HasRows)
                    {
                        while (oracleDataReader.Read())
                        {
                             /* big Bottleneck here ... */

                            // Parse the fields
                        }
                    }
                    DateTime dtEnd = DateTime.Now;
                    TimeSpan ts = new TimeSpan(dtEnd.Ticks - dtstart.Ticks);
                    lblDuration2.Text = "Duration: " + ts.ToString();
                    Disconnect(oracleConnection);
                }
            }
            catch (OracleException ex)
            {
                // log or rethrow
            }
        }
  • How many rows is it returning? – Claudio Redi Aug 08 '14 at 14:49
  • Yeah. Smells like "I expect hundreds of thousands of rows to transfer in no time over a 1gigabit link". – TomTom Aug 08 '14 at 14:50
  • 30,000 rows approx. Tried returning fewer rows but found the same bottleneck in .NET. – Bernardo Wilberger Aug 08 '14 at 14:52
  • It could be interesting to see how you use the reader. – Steve Aug 08 '14 at 14:56
  • Show the code for this bottleneck you found? – paparazzo Aug 08 '14 at 15:09
  • When you say that you read sequentially, does this mean that you used `cmd.ExecuteReader(CommandBehavior.SequentialAccess)`? I have also noticed cases where setting `cmd.CommandTimeout` to a higher value may significantly increase performance. – Will Marcouiller Aug 08 '14 at 15:27
  • I meant that I did the basic DataReader.Read() for each record and found the bottleneck in between each read. – Bernardo Wilberger Aug 08 '14 at 15:30
  • Your edit is not very helpful. This is more or less standard code when you work with a DataReader. The problem could be in how you read your values, for example in the case of nulls. – Steve Aug 08 '14 at 15:31
  • And if you want to measure performance, use a Stopwatch instance. But put it inside the loop to measure whether some particular record is causing the slowness. – Steve Aug 08 '14 at 15:33
  • The parsing of each field is commented out, so it's definitely the Read() itself that is slow. – Bernardo Wilberger Aug 08 '14 at 15:36
  • What do you expect us to tell you when you have identified the bottleneck in your code and then purposely *not* posted that code? – Lasse V. Karlsen Aug 08 '14 at 16:03
  • Sorry, I didn't mean to imply the bottleneck was part of that body. What is taking very long is the Read() itself. – Bernardo Wilberger Aug 08 '14 at 17:49
  • From the code sample you have posted, you are timing not only the read but also the parsing of the data. Have you run other tests where you measure the time solely around the read? I am inclined to believe that your issue is in the actual parsing and loading of your own collections, not the time it takes for the records to return. – pstrjds Aug 08 '14 at 18:02
  • You have got to be kidding. /* big Bottleneck here ... */ A comment statement is the bottleneck. – paparazzo Aug 08 '14 at 19:34
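
A minimal sketch of the per-row timing Steve suggests in the comments, assuming the same open `oracleDataReader` from the question's code (the parsing placeholder and output format are illustrative, not from the original post):

    // Sketch: time each Read() call individually to see whether the cost
    // is spread evenly across rows or concentrated on particular fetches.
    var sw = new System.Diagnostics.Stopwatch();
    var slowest = TimeSpan.Zero;
    int rows = 0;

    sw.Start();
    while (oracleDataReader.Read())   // each iteration fetches the next row
    {
        sw.Stop();                    // stop before parsing, so only Read() is measured
        if (sw.Elapsed > slowest) slowest = sw.Elapsed;
        rows++;

        // ... parse the fields here ...

        sw.Restart();                 // restart just before the next Read()
    }
    Console.WriteLine("Rows: " + rows + ", slowest single Read(): " + slowest);

If the slowest single Read() is small but the total is large, the time is accumulating evenly across fetch round-trips rather than being caused by one pathological row.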

1 Answer


This might help, though without more information on how you're actually using the reader, it's hard to say for sure.

using (var cnx = Connect()) 
    using (var cmd = CreateCommand(2)) {
         try {
             if (cnx.State == ConnectionState.Closed) cnx.Open();

             // The following line allows more time for the command to
             // execute. The smaller the value, the sooner the command
             // times out, so be sure to leave enough room for the
             // command to execute successfully.
             cmd.CommandTimeout = 600; 

             // The CommandBehavior specified below allows sequential
             // access against the underlying database. This means rows
             // are streamed through your reader instance: while the
             // program reads from the reader, the reader keeps reading
             // from the database instead of waiting for the full result
             // set to be returned before continuing.
             using (var reader = cmd.ExecuteReader(
                                          CommandBehavior.SequentialAccess)) {
                 if (reader.HasRows)
                     while (reader.Read()) {
                         // Perhaps bottleneck will disappear here...
                         // Without proper code usage of your reader
                         // no one can help.
                     }
             }
         } catch(OracleException ex) {
             // Log the exception here; otherwise it may be
             // best to let it go and rethrow.
         } finally {
             if (cnx.State == ConnectionState.Open) cnx.Close();
         }
    }

For more detailed information on command behaviors, see the CommandBehavior enumeration.

Directly from MSDN:

Sequential Access

Provides a way for the DataReader to handle rows that contain columns with large binary values. Rather than loading the entire row, SequentialAccess enables the DataReader to load data as a stream. You can then use the GetBytes or GetChars method to specify a byte location to start the read operation, and a limited buffer size for the data being returned.

When you specify SequentialAccess, you are required to read from the columns in the order they are returned, although you are not required to read each column. Once you have read past a location in the returned stream of data, data at or before that location can no longer be read from the DataReader. When using the OleDbDataReader, you can reread the current column value until reading past it. When using the SqlDataReader, you can read a column value only once.
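
As a sketch of the streaming pattern the quote describes (the column index and chunk size are illustrative assumptions, and the loop assumes the `cmd` from the answer above), reading a large binary column in chunks with GetBytes might look like:

    // Sketch: stream a large binary column in fixed-size chunks under
    // CommandBehavior.SequentialAccess. Columns must be read in the
    // order they are returned; column 0 is assumed to be the BLOB here.
    using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        var buffer = new byte[8192];
        while (reader.Read())
        {
            long offset = 0;
            long bytesRead;
            // Pull the value in 8 KB chunks instead of materializing
            // the whole row in memory at once.
            while ((bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                // ... process buffer[0 .. bytesRead) ...
                offset += bytesRead;
            }
        }
    }
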

As for increasing the CommandTimeout property, have a look at this post:

Increasing the Command Timeout for SQL command

When you expect the command to take a certain amount of time, you need a timeout long enough for the command to return before it expires. When a timeout occurs, it takes a few seconds to recover from it, and all of this can be avoided. You might want to measure the time the command actually requires and set the timeout as close as possible to that value: a timeout that is far too long can mask other underlying problems that a tighter timeout would have exposed. When a command timeout does occur, ask yourself how you could work with a smaller result set, or how you could improve your query to run faster.

Will Marcouiller