I have gone through the Insert 2 million rows into SQL Server quickly link and found that I can do this by using a bulk insert. So I am trying to create the DataTable (code below), but as this is a huge file (more than 300K rows) I am getting an OutOfMemoryException in my code:
string line;
DataTable data = new DataTable();
string[] columns = null;
bool isInserted = false;

using (TextReader tr = new StreamReader(_fileName, Encoding.Default))
{
    // Use the first line to determine how many columns the table needs.
    if (columns == null)
    {
        line = tr.ReadLine();
        columns = line.Split(',');
    }

    for (int iColCount = 0; iColCount < columns.Length; iColCount++)
    {
        data.Columns.Add("Column" + iColCount, typeof(string));
    }

    // Read the remaining lines and add each one as a row.
    string[] columnVal;
    while ((line = tr.ReadLine()) != null)
    {
        columnVal = line.Split(','); // OutOfMemoryException is thrown on this line
        data.Rows.Add(columnVal);
    }
}
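One variation I have been considering is to read the file lazily instead of through a TextReader loop; this is only a minimal sketch, where Skip(1) drops the header line and requires System.Linq:

// File.ReadLines enumerates the file line by line rather than loading it all,
// but every parsed row still accumulates in the DataTable, so this alone
// does not cap memory use.
foreach (var currentLine in File.ReadLines(_fileName, Encoding.Default).Skip(1))
{
    data.Rows.Add(currentLine.Split(','));
}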
After a lot of rework I modified my code as below, but I still get an OutOfMemoryException at the time of adding rows into the DataTable:
DataTable data = new DataTable();
string[] columns = null;
var line = string.Empty;

using (TextReader tr = new StreamReader(_fileName, Encoding.Default))
{
    // Read only the header line here, just to build the column list.
    if (columns == null)
    {
        line = tr.ReadLine();
        columns = line.Split(',');
    }

    for (int iColCount = 0; iColCount < columns.Length; iColCount++)
    {
        data.Columns.Add("Column" + iColCount, typeof(string));
    }
}
// Split the rows into lists of 20000 rows each.
// Note: ToList() materializes the entire file in memory up front.
var _fileList = File.ReadLines(_fileName, Encoding.Default).Skip(1).ToList(); // Skip(1) drops the header consumed above
var splitChunks = SplitFile(_fileList, 20000);

Parallel.ForEach(splitChunks, lstChunks =>
{
    foreach (var rows in lstChunks)
    {
        string[] lineFields = rows.Split(',');
        DataRow row = data.NewRow();
        for (int iCount = 0; iCount < lineFields.Length; iCount++)
        {
            row[iCount] = lineFields[iCount];
        }
        // Caution: DataTable is not thread-safe, so calling Rows.Add from
        // Parallel.ForEach without a lock can corrupt the table.
        data.Rows.Add(row);
    }
});
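For reference, SplitFile is just a helper that partitions the list into fixed-size chunks; I have not shown mine, but it is essentially along these lines:

// Partitions a list into consecutive chunks of at most chunkSize elements.
private static List<List<string>> SplitFile(List<string> lines, int chunkSize)
{
    var chunks = new List<List<string>>();
    for (int i = 0; i < lines.Count; i += chunkSize)
    {
        chunks.Add(lines.GetRange(i, Math.Min(chunkSize, lines.Count - i)));
    }
    return chunks;
}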
For the next step, the bulk insert itself, I have the following code:
SqlConnection SqlConnectionObj = GetSQLConnection();

// Dispose the SqlBulkCopy when done so its resources are released.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(SqlConnectionObj,
    SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.FireTriggers | SqlBulkCopyOptions.UseInternalTransaction,
    null))
{
    bulkCopy.DestinationTableName = "TempTable";
    bulkCopy.WriteToServer(data);
}
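To show the whole direction I am leaning towards: fill the DataTable up to a fixed batch size, write that batch with SqlBulkCopy, then clear the table so memory stays bounded. This is only a sketch under my assumptions (an already open connection, the same "TempTable" destination, and a hypothetical BulkLoadCsv wrapper):

static void BulkLoadCsv(string fileName, SqlConnection connection)
{
    const int batchSize = 20000;
    var table = new DataTable();

    // The connection is assumed to be open already.
    using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null))
    {
        bulkCopy.DestinationTableName = "TempTable";
        bulkCopy.BatchSize = batchSize; // rows sent per round trip to the server

        bool isHeader = true;
        foreach (var line in File.ReadLines(fileName, Encoding.Default))
        {
            string[] fields = line.Split(',');

            if (isHeader)
            {
                // Use the first line only to create the columns.
                for (int i = 0; i < fields.Length; i++)
                    table.Columns.Add("Column" + i, typeof(string));
                isHeader = false;
                continue;
            }

            table.Rows.Add(fields);

            // Flush and clear so the DataTable never holds more than one batch.
            if (table.Rows.Count >= batchSize)
            {
                bulkCopy.WriteToServer(table);
                table.Rows.Clear();
            }
        }

        // Write the final, possibly partial, batch.
        if (table.Rows.Count > 0)
            bulkCopy.WriteToServer(table);
    }
}

But I am not sure whether this batching approach is the fastest option, hence my question below.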
The file contains data like the following:
4714,1370,AUSRICHTEN MASCHINELL
4870,1370,PLATTE STECKEN
0153,1900,CAULK GUN
0154,1900,NEW TERMINATOR
0360,1470,MU 186 MACCH. X LAV. S/A ASTE PS174
9113-H22,1970,MC DRILL BITS
The code needs to convert this into 6 rows and 3 columns.

Is there a faster way to read the file and build the DataTable for the bulk insert, so that I do not run into the OutOfMemoryException?
Thanks in advance.