I am inserting a large amount of data. I thought it would be a good idea to group multiple INSERT statements into "batches", so I used a transaction to commit once per thousand INSERT statements:
using (var reader = selectCommand.ExecuteReader())
{
    var rowCount = 0;
    using (var transaction = mySqlConnection.BeginTransaction())
    {
        // Build the parameter list used for the insert queries
        var paramCount = reader.FieldCount;
        var queryParamBuilder = new StringBuilder();
        for (int paramNo = 0; paramNo < paramCount; paramNo++)
        {
            queryParamBuilder.Append($"@p{paramNo},");
        }
        // Remove the trailing comma
        queryParamBuilder.Remove(queryParamBuilder.Length - 1, 1);
        var queryParam = queryParamBuilder.ToString();

        while (reader.Read())
        {
            using (var insertCommand = new MySqlCommand($"INSERT INTO {table.Name} VALUES({queryParam})", mySqlConnection))
            {
                insertCommand.Transaction = transaction;
                for (int i = 0; i < paramCount; i++)
                {
                    insertCommand.Parameters.AddWithValue($"@p{i}", reader[i]);
                }
                insertCommand.ExecuteNonQuery();
            }
            rowCount++;
        }
        transaction.Commit();
    }
}
I thought the transaction would help, but it doesn't: I realized data is being sent to the server every time insertCommand.ExecuteNonQuery() is called (although, correctly, the data is not visible until Commit() is called). I can tell because, if there is a PRIMARY KEY or FOREIGN KEY violation, the exception is thrown immediately at the ExecuteNonQuery() call, not at the Commit() call.
Is there a way to send the data to the server only once per batch? Would TransactionScope work for this case?
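For reference, the kind of "send once" behavior I am hoping for would look something like the following sketch, which accumulates rows into a single multi-row INSERT statement so that each batch goes to the server in one ExecuteNonQuery() call. This is untested and reuses the same reader, mySqlConnection, and table as above; the batchSize value and the Flush helper are just names I made up for illustration:

```csharp
// Sketch (untested): build one multi-row INSERT per batch of rows so the
// whole batch is sent to the server in a single ExecuteNonQuery() call.
const int batchSize = 1000;              // hypothetical batch size
var paramCount = reader.FieldCount;
var values = new List<string>();         // one "(@p0_0,@p0_1,...)" group per row
var insertCommand = new MySqlCommand { Connection = mySqlConnection };
var batchRow = 0;

// Local helper: send the accumulated rows as one statement, then reset.
void Flush()
{
    if (values.Count == 0) return;
    insertCommand.CommandText =
        $"INSERT INTO {table.Name} VALUES {string.Join(",", values)}";
    insertCommand.ExecuteNonQuery();     // one round trip for the whole batch
    insertCommand.Parameters.Clear();
    values.Clear();
    batchRow = 0;
}

while (reader.Read())
{
    // Give each row its own parameter names so they stay distinct in the batch.
    var names = new string[paramCount];
    for (int i = 0; i < paramCount; i++)
    {
        names[i] = $"@p{batchRow}_{i}";
        insertCommand.Parameters.AddWithValue(names[i], reader[i]);
    }
    values.Add($"({string.Join(",", names)})");
    if (++batchRow == batchSize) Flush();
}
Flush();                                 // send any remaining partial batch
```

Is something along these lines the usual approach, or is there a built-in way to get the same effect?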