I have the following scenario (exception handling removed, code simplified):
SqlDataReader rdr = executeQuery(query);
while (rdr.Read()) {
    // There are thousands of records, so this happens thousands of times
    handleRecord(rdr); // Takes a couple of seconds per record
}
rdr.Close();
So the table involved in the query (on an overcrowded SQL Server 2000 instance) holds shared locks for a couple of hours or more, depending on the number of records.
These locks sometimes block other applications querying the database, so I've been asked to release them as soon as possible.
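(For context, the shared locks are easy to see from a second connection while the reader is open. A minimal sketch of what I run to watch them; sp_lock is the SQL Server 2000-era way to inspect locks, and connectionString is a placeholder:)

using System;
using System.Data.SqlClient;

// Sketch only: run this from a second connection while the slow
// reader loop above is in progress, to see the locks it is holding.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var cmd = new SqlCommand("EXEC sp_lock", conn))
    using (var rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            // sp_lock's result set includes spid, ObjId, Type, Mode, Status
            Console.WriteLine("{0}\t{1}\t{2}", rdr["spid"], rdr["Type"], rdr["Mode"]);
        }
    }
}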
So, short of the obvious
List<Record> rList = new List<Record>();
SqlDataReader rdr = executeQuery(query);
while (rdr.Read()) {
    // There are thousands of records, so this happens thousands of times
    storeRecord(rList, rdr); // Takes milliseconds
}
rdr.Close();
foreach (Record r in rList) {
    handleRecord(r);
}
which would limit the number of records I can handle to what fits in the machine's memory, is there any other alternative?
(I'm slowly moving this app to Dapper.NET, so it's already a dependency in other parts of the code, in case there's something in Dapper that could help with this scenario.)
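As far as I can tell, Dapper's Query&lt;T&gt; is buffered by default (buffered: true), which is essentially the "obvious" approach above: it drains and closes the reader before handing back the first row, so the locks go away quickly but the whole result set sits in memory. A minimal sketch, assuming a Record class whose properties match the query's columns and a placeholder connectionString:

using Dapper;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;

// Sketch only: Dapper's buffered Query<T> materializes the whole result
// set and closes the reader before returning, so this has the same
// memory/lock trade-off as the list-based code above.
using (var conn = new SqlConnection(connectionString))
{
    List<Record> rList = conn.Query<Record>(query).ToList(); // buffered: true is the default
    foreach (Record r in rList)
    {
        handleRecord(r); // the locks were already released when Query returned
    }
}

(Passing buffered: false streams rows instead, but that keeps the reader, and therefore the locks, open for the whole loop, which is exactly the original problem.)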