I would like to use a DataReader, since my file has millions of lines and loading it into a DataTable for SqlBulkCopy seems too slow. I implemented a reader that reads my file this way:
//snippet from CSVReader implementation
Read();
_csvHeaderstring = _csvlinestring;
_header = ReadRow(_csvHeaderstring);
int i = 0;
_csvHeaderstring = "";
foreach (var item in _header) //read each column and create a dummy header.
{
    headercollection.Add("COL_" + i.ToString(), null);
    _csvHeaderstring = _csvHeaderstring + "COL_" + i.ToString() + _delimiter;
    i++;
}
_csvHeaderstring = _csvHeaderstring.TrimEnd(_delimiter); //TrimEnd returns a new string, so assign the result back.
_header = ReadRow(_csvHeaderstring);
Close(); //close and reopen to reset the read position to the beginning.
_file = File.OpenText(filePath);
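My understanding is that when a ColumnMapping names a source column, SqlBulkCopy resolves that name through the reader's GetName/GetOrdinal members, so I implemented those roughly as follows (a simplified sketch; _header holds the dummy names built above):

//sketch: name-lookup members SqlBulkCopy uses when a mapping
//names a source column. _header holds the dummy names built above.
public string GetName(int i)
{
    return _header[i]; //e.g. "COL_0"
}

public int GetOrdinal(string name)
{
    //case-insensitive lookup of a dummy header name, e.g. "COL_3".
    for (int j = 0; j < _header.Length; j++)
    {
        if (string.Equals(_header[j], name, StringComparison.OrdinalIgnoreCase))
            return j;
    }
    throw new IndexOutOfRangeException("Column '" + name + "' is not in the source.");
}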
public static void MyMethod()
{
    textDataReader rdr = new textDataReader(filePath, '\t', false);
    using (SqlBulkCopy bulkcopy = new SqlBulkCopy(connectionString))
    {
        bulkcopy.DestinationTableName = "[dbo].[MyTable]";
        bulkcopy.ColumnMappings.Add("COL_0", "DestinationCol1");
        bulkcopy.ColumnMappings.Add("COL_1", "DestinationCol2");
        //COL_2 is deliberately unmapped: it is the column I want to skip.
        bulkcopy.ColumnMappings.Add("COL_3", "DestinationCol3");
        bulkcopy.ColumnMappings.Add("COL_4", "DestinationCol4");
        bulkcopy.ColumnMappings.Add("COL_5", "DestinationCol5");
        bulkcopy.ColumnMappings.Add("COL_6", "DestinationCol6");
        bulkcopy.WriteToServer(rdr);
        bulkcopy.Close(); //redundant inside the using block, but harmless.
    }
}
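For completeness, the row-iteration members that I believe WriteToServer exercises on a plain IDataReader source look like this in my reader (a simplified sketch; _currentRow is a field I use to hold the parsed fields of the current line):

//sketch: row-iteration members WriteToServer calls while copying.
//_currentRow holds the parsed fields of the most recently read line.
public bool Read()
{
    _csvlinestring = _file.ReadLine();
    if (_csvlinestring == null)
        return false; //end of file
    _currentRow = ReadRow(_csvlinestring);
    return true;
}

public int FieldCount
{
    get { return _header.Length; }
}

public object GetValue(int i)
{
    return _currentRow[i];
}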
However, when the bulk copy runs with these column mappings (skipping the unnecessary column), I get the error:

The given ColumnMapping does not match up with any column in the source or destination.

I have also tried to map this way, to no avail. Will this implementation of IDataReader work for SqlBulkCopy? I tried implementing an 'ignore column' method, but that didn't work either.
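For reference, one variant I considered was mapping by source ordinal rather than by source name, using the Add(int, string) overload; I am unsure whether that sidesteps the name lookup (a sketch under the same assumptions as above):

//sketch: mapping source columns by ordinal instead of by name,
//still leaving source column 2 unmapped.
bulkcopy.ColumnMappings.Add(0, "DestinationCol1");
bulkcopy.ColumnMappings.Add(1, "DestinationCol2");
bulkcopy.ColumnMappings.Add(3, "DestinationCol3");
bulkcopy.ColumnMappings.Add(4, "DestinationCol4");
bulkcopy.ColumnMappings.Add(5, "DestinationCol5");
bulkcopy.ColumnMappings.Add(6, "DestinationCol6");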