
I want to export a DB table to Excel. The table has approximately 5,804,806 records that I want to export.

I have tried different ways, but I keep getting:

Insufficient memory to continue the execution of the program

When I export less data, the function below works fine, but as soon as I try to pull a report with 5 million records it throws the insufficient-memory error.

How can I achieve it?

The code I have used, which is not working:

public static void GetExcelFromDataTable(DataTable dt, string fileName)
{
    HttpContext.Current.Response.ClearContent();
    HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache);
    HttpContext.Current.Response.AddHeader("content-disposition", string.Format("attachment;filename={0}.xls", fileName));
    //HttpContext.Current.Response.ContentType = "application/vnd.ms-excel"; //Excel 2003
    HttpContext.Current.Response.ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"; //Excel 2007

    string tab = "";
    foreach (DataColumn dc in dt.Columns)
    {
        HttpContext.Current.Response.Write(tab + dc.ColumnName);
        tab = "\t";
    }
    HttpContext.Current.Response.Write("\n");

    foreach (DataRow dr in dt.Rows)
    {
        tab = "";
        for (int i = 0; i < dt.Columns.Count; i++)
        {
            if (dr[i].ToString().Trim() == "\"")
            {
                dr[i] = " \" ";
            }
            if (dr[i].ToString().StartsWith("0"))
            {
                // force Excel to treat leading-zero values as text
                HttpContext.Current.Response.Write(tab + @"=""" + dr[i].ToString().Trim() + @"""");
            }
            else
            {
                HttpContext.Current.Response.Write(tab + dr[i].ToString().Trim());
            }
            tab = "\t";
        }
        HttpContext.Current.Response.Write("\n");
    }
    HttpContext.Current.Response.End();
}

ADO.NET code:

if (ReportName == "SalesStatusVsMRPStatus")
{
    query = "select Material,Material_Description,Eng_Resp,PLDv,SalesOrg,DC,Valid_from,SS,ItemCG,DelPlant,DelPlantMS,DelPlantValid_from,Grp From " + ReportName;
    //query = "SELECT TOP 10 * FROM " + ReportName;
}
else
{
    query = "select * From " + ReportName;
}

SqlCommand cmd = new SqlCommand(query, cn);
SqlDataAdapter adap = new SqlDataAdapter(cmd);
DataTable dtUserData = new DataTable();
cmd.CommandTimeout = 999999;
cmd.CommandType = CommandType.Text;
try
{
    cn.Open();
    adap.Fill(dtUserData);
    foreach (DataColumn dc in dtUserData.Columns)
    {
        if (dc.ColumnName == "Grp")
        {
            dc.ColumnName = "AuthGrp";
        }
    }
}
catch (Exception ex)
{
}
finally
{
    cmd.Connection.Close();
    cn.Close();
    GridViewExportUtil.GetExcelFromDataTable(dtUserData, fileName);
}


James Z
  • Does this answer your question? [T-SQL: Export to new Excel file](https://stackoverflow.com/questions/9086880/t-sql-export-to-new-excel-file) – Stu Dec 10 '21 at 09:23
  • 1
    According to the Excel workbook specifications and limits available [here](https://support.microsoft.com/en-us/office/excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3), there is a hard limit to the number of rows at 1,048,576. – Gimly Dec 10 '21 at 09:28
  • 1
    Write a CSV file on your server, zip it, download it. That moves the problem to the client, which then has to read that big file. But if you need it for backup purposes you can just write code to do the reverse (upload, unzip, restore) – Steve Dec 10 '21 at 09:41
  • If you really want to do this on your server, you'll need to split your processing up into batches (using [paginated queries](https://stackoverflow.com/a/2380427/9363973)) and process a sane number of objects at a time. Of course, if your server simply doesn't have the required amount of RAM to keep the Excel file open it'll still crash; if so, I'd recommend you set a sane maximum size for an individual Excel file and also batch up Excel file creation, producing multiple Excel files – MindSwipe Dec 10 '21 at 09:44
  • 1
    Why are you trying to do this? You can't have more than 1M rows in a single Excel sheet. That's a hard limit. If you use PowerPivot you can put in there as much data as memory allows *BUT* that only works with Excel, the application. You could use Excel interop to load data using PowerPivot but definitely not on a web server. An Excel file with 5M rows, even split among multiple sheets, isn't very useful either. Excel is for human use, not for transferring large amounts of data – Panagiotis Kanavos Dec 10 '21 at 09:53
  • Besides, you didn't even post any Excel-generating code. If you use a library like Epplus or NPOI to create real Excel files you can load the data using a data reader and avoid caching everything in memory. You'd still end up with problems and an unwieldy file though – Panagiotis Kanavos Dec 10 '21 at 09:54

1 Answer


There are two issues I see here:

  • Program-side memory allocation
    A DataAdapter fills a DataSet/DataTable, and it downloads the full table before closing its connection to the DB, so the whole table has to fit in memory first. If you don't need to edit the data in the DB after reading it, you can use a DataReader instead. It is read-only and forward-only, and it fetches only one record at a time, so very little memory is consumed. I'm working with the Oracle provider, though; you'd have to check whether your DB provider supports a DataReader in C# (SQL Server does, via SqlDataReader).

  • Excel limit
    Even .xlsx supports only 1,048,576 rows per sheet. You'll have to switch to a new sheet (or a new file) as you generate the Excel output.
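A minimal sketch of both fixes together, assuming SQL Server (SqlDataReader) and writing one tab-separated file per 1,048,575 data rows rather than a true .xlsx; the connection string, query, and output naming are placeholders:

```csharp
using System.Data.SqlClient;
using System.IO;

class StreamingExport
{
    const int MaxRowsPerFile = 1048575; // Excel sheet limit (1,048,576) minus the header row

    static void Export(string connectionString, string query, string baseName)
    {
        using (var cn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(query, cn))
        {
            cmd.CommandTimeout = 0; // no timeout for a long-running export
            cn.Open();

            // the reader streams one row at a time instead of buffering the whole table
            using (var reader = cmd.ExecuteReader())
            {
                int fileIndex = 0, rowsInFile = 0;
                StreamWriter writer = null;
                try
                {
                    while (reader.Read())
                    {
                        // roll over to a new file when the current one hits the row limit
                        if (writer == null || rowsInFile == MaxRowsPerFile)
                        {
                            writer?.Dispose();
                            writer = new StreamWriter($"{baseName}_{++fileIndex}.xls");
                            for (int i = 0; i < reader.FieldCount; i++)
                                writer.Write((i > 0 ? "\t" : "") + reader.GetName(i));
                            writer.WriteLine();
                            rowsInFile = 0;
                        }
                        for (int i = 0; i < reader.FieldCount; i++)
                            writer.Write((i > 0 ? "\t" : "") + reader[i]);
                        writer.WriteLine();
                        rowsInFile++;
                    }
                }
                finally { writer?.Dispose(); }
            }
        }
    }
}
```

Because each row is written out as soon as it is read, memory use stays flat regardless of how many rows the query returns.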

DataSet - Downloads the full table, supports DB editing, and can work offline after the query result is retrieved.

DataReader - Read-only, and needs to stay connected to the DB the whole time it is working. Retrieval is much faster too, in my own experience with C# and Oracle DB.
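The difference in memory behavior, sketched for the SQL Server provider (the `cmd` parameter is assumed to be a SqlCommand with an open connection):

```csharp
using System.Data;
using System.Data.SqlClient;

class MemoryBehavior
{
    static void Compare(SqlCommand cmd)
    {
        // DataAdapter/DataSet: materializes the entire result set in memory first
        var table = new DataTable();
        new SqlDataAdapter(cmd).Fill(table); // all ~5.8M rows buffered at once

        // DataReader: forward-only, only the current row is held in memory
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // write the current row to the output immediately, then discard it
            }
        }
    }
}
```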

This post has more details: Difference between Datareader, DataAdapter, Dataset, DataView

Xiang Wei Huang