I load a DataTable which has approx. 60,000 rows.
I loop over these and write each one out like this to build a file:
HttpContext.Current.Response.Write(lineItem);
HttpContext.Current.Response.Write(Environment.NewLine);
When I iterate over ~15,000–25,000 rows it finishes within 5 seconds, and the file is generated perfectly. However, there seems to be a limit beyond which it suddenly takes a long time and then times out.
I get the error:
    System.Threading.ThreadAbortException: Thread was being aborted.
       at System.AppDomain.GetId()
       at System.Threading.Thread.get_CurrentCulture()
       at System.String.IndexOf(String value)
       at Modules.DownloadReports.RemoveTags(String s)
       at Modules.DownloadReports.WriteProductData(DataTable products)
My timeout in web.config is 3600 seconds, for both the application and SQL calls.
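For reference, the relevant part of my web.config looks roughly like this (a sketch showing only the timeout attribute; everything else is omitted):

```xml
<system.web>
  <!-- executionTimeout is in seconds -->
  <httpRuntime executionTimeout="3600" />
</system.web>
```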
My code:
    private void WriteProductData(DataTable products)
    {
        StringBuilder bld = new StringBuilder();
        try
        {
            //Column names
            const string str = "OrderId; ProductId; ProductName; SeriesName; BrandName; PrimaryCategory; Content; ContentUnit; Quantity; ListPrice; PlacedPrice; Status";
            bld.AppendLine(str);
            bld.AppendLine(Environment.NewLine);

            //Data
            string trackingNumber, productId, productName, seriesName, brandName, prodCategory, content, contentUnit, quantity, listPriceUnit, placedPriceUnit, status;
            string lineItem;
            string errorString = string.Empty;
            foreach (DataRow product in products.Rows)
            {
                // initialize all the different strings
                lineItem = string.Format("{0};{1};{2};{3};{4};{5};{6};{7};{8};{9};{10};{11};",
                    trackingNumber, productId, productName, seriesName, brandName,
                    prodCategory, content, contentUnit, quantity, listPriceUnit,
                    placedPriceUnit, status);

                //bld.AppendLine(lineItem);
                //bld.AppendLine(Environment.NewLine);
                HttpContext.Current.Response.Write(lineItem);
                HttpContext.Current.Response.Write(Environment.NewLine);
            }
        }
        catch (Exception ex)
        {
            Logger.ErrorFormat(ex, "An exception occurred during writing of product data to BI report");
            throw;
        }
        finally
        {
            //Response.Write(bld.ToString());
        }
    }
What have I tried...
I've optimized the loop as much as possible, so the only work left per row is building a string by concatenating with string.Format.
I've also tried using a StringBuilder to build one big string and then writing it out at the end.
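For concreteness, the StringBuilder variant I tried looked roughly like this (a sketch; the per-row formatting is abbreviated and the variable initialization is elided, as in the code above):

```csharp
// Sketch: build the whole report in memory, then write it with a single
// Response.Write instead of two Response.Write calls per row.
var bld = new StringBuilder();
bld.AppendLine(headerLine);
foreach (DataRow product in products.Rows)
{
    // ... initialize the per-row strings as before ...
    bld.AppendLine(string.Format("{0};{1};...", trackingNumber, productId /* etc. */));
}
HttpContext.Current.Response.Write(bld.ToString());
```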
So my questions are...
- Why do I even get the "Thread was being aborted" exception? It takes approx. 75 seconds before it gives me the error, not 5 minutes, let alone 3600 seconds.
- ~15,000–25,000 rows take 5 seconds, so by my logic 60,000 rows should take at most ~15 seconds. What could be going wrong here?
UPDATE:
I solved the problem by removing the cause of the long operation: a string operation that was quite slow. Since it was called 3 times per row, it made everything extremely slow.
However, that's not the root of the problem. It's just a pragmatic fix that will work until the data load grows much bigger.
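The stack trace above points at RemoveTags calling String.IndexOf; a repeated IndexOf scan per call, times 3 calls per row, times 60,000 rows, adds up quickly. The original body of RemoveTags isn't shown, so the version below is purely a hypothetical single-pass replacement, only to illustrate avoiding repeated scans over the same string:

```csharp
using System.Text;

static class TagStripper
{
    // Hypothetical replacement for RemoveTags: one pass over the input,
    // skipping everything between '<' and '>', instead of repeatedly
    // calling IndexOf and rebuilding the string.
    public static string RemoveTags(string s)
    {
        var sb = new StringBuilder(s.Length);
        bool inTag = false;
        foreach (char c in s)
        {
            if (c == '<') inTag = true;
            else if (c == '>') inTag = false;
            else if (!inTag) sb.Append(c);
        }
        return sb.ToString();
    }
}
```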