I am trying to export a SQL Server table with 1 million rows and 45 columns to a .csv file for the user to download via the web interface, but it takes so long that I eventually have to stop the process manually. I use a SqlDataReader and write to the file as the reader reads, to avoid memory problems. The code works for small tables (fewer than 3,000 rows), but with the large table it keeps running and the destination file stays at 0 KB.
using (spContentConn)
{
    using (var sdr = sqlcmd.ExecuteReader())
    using (CsvfileWriter)
    {
        // Collect the column names into a DataTable so they can be written as the header row.
        DataTable Tablecolumns = new DataTable();
        for (int i = 0; i < sdr.FieldCount; i++)
        {
            Tablecolumns.Columns.Add(sdr.GetName(i));
        }
        CsvfileWriter.WriteLine(string.Join("~", Tablecolumns.Columns.Cast<DataColumn>().Select(csvfile => csvfile.ColumnName)));

        // For each row, walk the column indexes from right to left;
        // when j reaches 1, end the line instead of writing a field.
        while (sdr.Read())
        {
            for (int j = Tablecolumns.Columns.Count; j > 0; j--)
            {
                if (j == 1)
                    CsvfileWriter.WriteLine("");
                else
                    CsvfileWriter.Write(sdr[Tablecolumns.Columns.Count - j].ToString() + "~");
            }
        }
    }
}
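
For clarity, here is a minimal, self-contained sketch of what I am trying to achieve. The names ExportToCsv, connString, query, and path are placeholders standing in for my actual connection (spContentConn), command (sqlcmd), and writer (CsvfileWriter), which are created elsewhere; this version writes the fields left to right and joins them with the ~ delimiter:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

class CsvExport
{
    static void ExportToCsv(string connString, string query, string path)
    {
        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(query, conn))
        {
            conn.Open();
            using (var sdr = cmd.ExecuteReader())
            using (var writer = new StreamWriter(path))
            {
                // Header row: column names joined by the ~ delimiter.
                var names = Enumerable.Range(0, sdr.FieldCount).Select(sdr.GetName);
                writer.WriteLine(string.Join("~", names));

                // One line per row, fields written left to right (0 .. FieldCount - 1).
                var fields = new string[sdr.FieldCount];
                while (sdr.Read())
                {
                    for (int i = 0; i < sdr.FieldCount; i++)
                        fields[i] = sdr.IsDBNull(i) ? "" : sdr[i].ToString();
                    writer.WriteLine(string.Join("~", fields));
                }
            } // disposing the writer flushes any buffered output to disk
        }
    }
}

One thing I noticed while writing this up: StreamWriter buffers its output and only writes to disk when the buffer fills or the writer is flushed/disposed, so a process that is killed mid-export can leave a 0 KB file even after many rows have been read.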
I used the answer recommended in this thread, but it still doesn't work: export large datatable data to .csv file in c# windows applications. Please help.