I have an ASP.NET website hosted in IIS. I have recently noticed that when large result sets are returned from the database, the memory for the IIS Worker Process just keeps growing (by about 400 MB each time the query is run). If a few of these large queries happen to run at the same time, they can eat away at the memory (I have seen it reach 5 GB) and the server slows right down.
I have narrowed it down to a single line of code, where the data is loaded into the DataTable:
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = new SqlCommand(storedProcedureName, connection))
using (DataTable dataTable = new DataTable())
{
    command.CommandType = System.Data.CommandType.StoredProcedure;
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        // Memory spikes on dataTable.Load
        dataTable.Load(reader);
    }
}
What I don't understand is that the memory allocated to the DataTable never seems to be released as I would expect. When the DataTable falls out of scope, when the webpage is navigated away from, or even when the user logs out of the site, the memory stays at the same level. This is obviously an issue in a multi-user system.
I have used a memory profiler, and it shows thousands of strings held in memory (the results of the query stored in the DataTable), but I am not really sure where to go from here. Am I misunderstanding how I should be handling this?
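For what it's worth, my understanding is that a streaming approach with SqlDataReader would avoid buffering the whole result set, roughly like the sketch below (ProcessRow is a hypothetical placeholder for per-row handling, not my real code). I'm not sure whether this is an option for me, since the page currently binds the full DataTable:

```csharp
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = new SqlCommand(storedProcedureName, connection))
{
    command.CommandType = System.Data.CommandType.StoredProcedure;
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        // Only one row is materialised at a time, instead of
        // holding every row of the result set in a DataTable.
        while (reader.Read())
        {
            ProcessRow(reader); // hypothetical per-row handler
        }
    }
}
```

Would something like this be the expected way to handle large result sets, or is the DataTable approach supposed to release its memory once it is disposed?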