Using C# and .NET 3.5, with either an ADO.NET (SqlClient) or OLE DB connection, filling a DataTable or DataSet with 700k rows of 3 columns each takes over 15 seconds.
Executing the actual SELECT statement on the database takes less than a second. The database is on a different machine from the one querying it and processing the data. (Perhaps this adds time?)
The data looks like this:
public class Node
{
    public DateTime Timestamp;
    public float Value;
    public string Name;
}
Reading with a SqlDataReader instead, calling reader.Read() and manually copying each row into a new instance of the above class before adding it to a List&lt;Node&gt;, also takes over 15 seconds.
The code looks like this:
List<Node> data = new List<Node>();
while (reader.Read())
{
    Node n = new Node();
    n.Timestamp = (DateTime)reader["Timestamp"];
    n.Value = (float)reader["Value"];
    n.Name = (string)reader["NodeName"];
    data.Add(n);
}
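For reference, one common tweak (it will not explain a 15-second stall by itself, but it removes per-row overhead) is to resolve each column ordinal once, use the typed getters instead of the string indexer, and pre-size the list. A sketch, assuming the same column names as above and an already-open reader:

```csharp
// Pre-size the list so it is not repeatedly re-allocated while growing to 700k.
List<Node> data = new List<Node>(700000);

// Look up each column ordinal once instead of doing a string lookup per row.
int tsOrdinal = reader.GetOrdinal("Timestamp");
int valueOrdinal = reader.GetOrdinal("Value");
int nameOrdinal = reader.GetOrdinal("NodeName");

while (reader.Read())
{
    Node n = new Node();
    // Typed getters avoid the boxing/unboxing that
    // (DateTime)reader["Timestamp"] incurs on every row.
    n.Timestamp = reader.GetDateTime(tsOrdinal);
    n.Value = reader.GetFloat(valueOrdinal);
    n.Name = reader.GetString(nameOrdinal);
    data.Add(n);
}
```

This typically shaves per-row cost but will not fix a bottleneck that is in the network transfer itself.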
I measured this using the Stopwatch class, in Release mode with optimizations turned on in the project properties.
I understand that it has to iterate over each record, but I would have expected any machine today to iterate 700k records in a few seconds, not more.
What could be causing this to take over 15 seconds? Is it unreasonable to expect this to be much faster?
EDIT: Running the SqlDataReader.Read() loop by itself, without constructing any Node objects, also takes over 15 seconds.
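For completeness, this is the kind of minimal loop I used to time the bare reads in isolation (assuming an already-open reader); if this alone takes ~15 seconds, the time is going to fetching rows from the server rather than to building the Node objects:

```csharp
// Times only the row-fetch loop: no column access, no allocation.
Stopwatch sw = Stopwatch.StartNew();
int count = 0;
while (reader.Read())
{
    count++;
}
sw.Stop();
Console.WriteLine("Read " + count + " rows in " + sw.ElapsedMilliseconds + " ms");
```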