I am working on an admin site built using ASP.NET 4.5 (C# / WebForms) on top of a SQL Server datastore, which requires me to retrieve large recordsets (250K+ records). In development, on IIS Express, I consistently receive Out Of Memory (OOM) exceptions originating from IIS. I can watch in Task Manager as the memory usage spikes from about 120,000K to over 600,000K.
The problem (the OTHER problem) I am running into is that when I go back and filter the query so it doesn't return as many records, I continue to get OOM exceptions because IIS is not releasing the memory from the previous query. The memory required for the new query just piles on top of the memory already allocated.
On a full IIS 7 installation (still on my DEV box) I do not get the initial OOM exception from IIS; however, it is still not releasing the memory, which will continue to cause problems for future IIS activity as the memory keeps piling up. I understand that the GC should eventually get around to cleaning it up, but I need it to happen immediately.
So the question is how do I free the unused, yet allocated memory from IIS on demand?
I had found the following SF articles ...
which led me to try the following ...
protected override void OnPreRender(EventArgs e)
{
    try
    {
        var data = this.GetSessionList();
        BindUI(data);
    }
    catch (Exception ex)
    {
        //
        // display error message
        while (ex.InnerException != null) { ex = ex.InnerException; }
        String msg = (ex.GetType() == typeof(OutOfMemoryException))
            ? MessagesConfigService.OutOfMemory.SessionReports
            : ex.Message;
        ucAlert.ShowMessage(msg, ucAlertMessage.AlertStates.DANGER);
    }
    finally
    {
        //
        // Clean up anything being held on to in memory. This will not prevent OOM
        // exceptions for too-large queries, but will prevent further OOM errors for
        // smaller queries that follow.
        GC.Collect();
        Int64 totalMemory = GC.GetTotalMemory(true);
    }
    base.OnPreRender(e);
}
In the finally clause I have tried each of the GC methods independently as well as together, but have seen no reduction in the memory held by IIS.
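For completeness, the most aggressive combination I have tried in that finally block looks roughly like the sketch below. The LOH compaction setting is something I have only read about and have not verified on this project; it requires .NET 4.5.1 or later, so treat that line as an assumption on my part.

System.Runtime.GCSettings.LargeObjectHeapCompactionMode =
    System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce; // .NET 4.5.1+ only
GC.Collect();                  // full, blocking collection of all generations
GC.WaitForPendingFinalizers(); // let finalizers run and release their resources
GC.Collect();                  // collect anything the finalizers just freed up
Int64 totalMemory = GC.GetTotalMemory(true); // passing true forces yet another blocking collection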
The other thing I had not originally thought about, but realize is important as I am typing this: I need to clear the memory but preserve Session State. Whatever the solution is, it must not wipe out session state, which would result in the user having to log back in.
So ... any thoughts?
Thank you in advance :)
-G
P.S. The reason for such large recordsets is that the business wants to be able to export the data into an Excel spreadsheet, so simple pagination of the results is not necessarily an option. Also, there is a fair amount of latency as the query executes, so having to endure the execution time for each page would be prohibitive. I have questions about performance tuning this query, but that is obviously outside the scope of this question.
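For what it's worth, one idea I have been toying with (but have not implemented, so the connection string, table, and column names below are purely illustrative) is to stream the export with a SqlDataReader from the page's code-behind and write each row straight to the response as CSV, so the full recordset never has to be materialized in memory at once:

// Hypothetical sketch -- requires System.Data.SqlClient; connectionString,
// dbo.Sessions, and the column names are placeholders, not my real schema.
protected void ExportToCsv()
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT SessionId, UserName, StartedOn FROM dbo.Sessions", conn))
    {
        conn.Open();
        Response.Clear();
        Response.ContentType = "text/csv";
        Response.AddHeader("Content-Disposition", "attachment; filename=sessions.csv");

        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // One row at a time; nothing is accumulated in a list or DataTable.
                Response.Write(String.Format("{0},{1},{2}\r\n",
                    reader["SessionId"], reader["UserName"], reader["StartedOn"]));
            }
        }
        Response.Flush();
        Response.End();
    }
}

That is probably a separate question, though; the immediate issue is still getting IIS to give the memory back.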