I'm inserting a large amount of data into our DB like this:
foreach (var item in ...)
{
    try
    {
        // ...process the new item from the external data
        // ...assign the data to the new item
        db.MyTable.Add(theNewItem);
        i++;
        if ((i % 100) == 0) db.SaveChanges();
    }
    catch (Exception e)
    {
        // ignore the error... it can be a duplicated row
        errors++;
    }
} // end foreach
if (db.ChangeTracker.HasChanges()) db.SaveChanges();
For performance, I'm saving to the DB once for every 100 records added. My question is:

If one of the next 100 records is a duplicate, SaveChanges throws an error. But are the other 99 records saved correctly?

I can't verify this right now: I'm not in the office, and the import process is still running (approx. 1 million records).
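For context on what I'm worried about: as far as I know, by default SaveChanges applies all pending changes in a single transaction, so if one row in the batch fails, none of that batch is persisted, and the failing entity stays tracked so the next SaveChanges hits the same error. A minimal sketch of the alternative I'm considering (assuming EF with `DbUpdateException`; `MyDbContext`, `externalData`, and `MapToEntity` are hypothetical names) would catch around SaveChanges itself and detach the pending entries:

```csharp
// Sketch, not verified against our schema: batch insert where one failed
// batch does not poison the later ones.
int i = 0, errors = 0;
foreach (var sourceItem in externalData)        // hypothetical source
{
    db.MyTable.Add(MapToEntity(sourceItem));    // hypothetical mapping helper
    if ((++i % 100) == 0) TrySave(db, ref errors);
}
TrySave(db, ref errors);                        // flush the last partial batch

static void TrySave(MyDbContext db, ref int errors)
{
    try
    {
        // SaveChanges applies the whole batch in one transaction:
        // if any row fails, none of this batch is persisted.
        db.SaveChanges();
    }
    catch (DbUpdateException)
    {
        errors++;
        // Detach the still-pending entities so the next batch can proceed;
        // otherwise the failing row is retried on every later SaveChanges.
        foreach (var entry in db.ChangeTracker.Entries()
                                .Where(e => e.State == EntityState.Added)
                                .ToList())
            entry.State = EntityState.Detached;
    }
}
```

This trades losing the 99 good rows of a failed batch for keeping the import moving; retrying them one by one would be the next refinement.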