
I'm inserting several records into our DB like this:

```csharp
int i = 0, errors = 0;

foreach (var item in externalData) // placeholder for the actual source
{
    try
    {
        // ..process new item from external data
        // ..assign data to the new item
        db.MyTable.Add(theNewItem);
        i++;
        if ((i % 100) == 0) db.SaveChanges();
    }
    catch (Exception e)
    {
        // ignore the error... it can be a duplicated row
        errors++;
    }
} // end foreach

if (db.ChangeTracker.HasChanges()) db.SaveChanges();
```

For performance, I'm saving to the DB after every 100 records added. My question is:

If one of the next 100 records is a duplicate, it will give me an error. But... are the other 99 saved correctly?

I can't verify this right now; I'm not in the office and the process is still running, importing the data (approx. 1 million records).

Yuri Morales
  • So you kick off some code, run home and ask SO to mentally debug it for you? ;-) Your actual question has been answered before. `SaveChanges()` does so in a transaction, so if one record in that transaction fails for whatever reason, none of the records will be saved. So you can run the import again next monday. – CodeCaster Sep 18 '15 at 15:31
  • Let us know when you get back to the office. –  Sep 18 '15 at 15:33
  • No. My code is saving each single record. (not 100). And the performance is a bit slow. I just read about save each 100 records to improve perf. and jump this question for me. Thanks anyway. I'll check next monday. Thanks. – Yuri Morales Sep 18 '15 at 15:37
  • Side note: keeping one context instance (`db`) defeats one of the purposes of saving in batches. You should renew the context for each batch. – Gert Arnold Sep 18 '15 at 20:12

1 Answer


The answer to your question is two-fold.

First is the answer to "What does SaveChanges() do?". This method gets all the pending records from the change tracker and writes them to the database in a transaction. It will roll back the transaction if any error occurs during either of those steps (collecting and saving). This is documented.

Now that you know this, your question is actually whether the successful operations within a rolled-back transaction get committed. They don't; that's exactly what transactions are for. So if 1 record out of 100 inserts causes an error, 0 get saved.
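Given that, one way to keep a single duplicate from discarding all 100 pending inserts is to save each batch with its own context and its own try/catch, so a failed `SaveChanges()` only rolls back that batch (this also matches Gert Arnold's comment about renewing the context). A minimal sketch — `MyDbContext`, `MyItem`, `externalData`, and the `MapToItem` helper are all assumptions, not names from the question:

```csharp
const int batchSize = 100;
int errors = 0;
var batch = new List<MyItem>();

foreach (var source in externalData) // placeholder for the actual source
{
    batch.Add(MapToItem(source)); // hypothetical mapping helper
    if (batch.Count == batchSize)
    {
        errors += SaveBatch(batch);
        batch.Clear();
    }
}
if (batch.Count > 0) errors += SaveBatch(batch);

// Returns the number of rows lost if the batch's transaction rolled back.
static int SaveBatch(List<MyItem> items)
{
    try
    {
        using (var db = new MyDbContext()) // fresh context per batch
        {
            db.MyTable.AddRange(items);
            db.SaveChanges(); // one transaction per batch
            return 0;
        }
    }
    catch (DbUpdateException)
    {
        // The whole batch was rolled back; you could retry its rows
        // one at a time here to isolate the duplicate.
        return items.Count;
    }
}
```

The trade-off: a duplicate still costs you the 99 good rows in its batch unless you add the row-by-row retry in the catch block.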

Regarding your comment, you're actually looking for a way to bulk insert records. Look up "bulk copy", as suggested in Fastest Way of Inserting in Entity Framework. Note that this still won't work if any insert fails. Sanitize your data before attempting to insert.
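For scale, the bulk-copy route from that linked answer looks roughly like this — a sketch using `SqlBulkCopy` from `System.Data.SqlClient`, which bypasses EF change tracking entirely; the table name, `connectionString`, and `dataTable` are placeholders for the asker's actual objects:

```csharp
using System.Data;
using System.Data.SqlClient;

// dataTable: a DataTable whose columns match dbo.MyTable's schema.
using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.MyTable";
    bulk.BatchSize = 5000;          // rows sent per round trip
    bulk.WriteToServer(dataTable);  // still fails as a unit on bad rows
}
```

As the answer notes, this doesn't change the failure semantics: a bad row still aborts the operation, so deduplicate the data first.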

CodeCaster
  • @YuriMorales, don't just accept the first answer that comes along.. it is certainly possible to change the entity state of the entries that are being tracked. Review `DbContext.ChangeTracker.Entries().Where(entry => entry.State == EntityState.Added);` – Brett Caswell Sep 18 '15 at 16:21
  • @Brett which question is that answering? – CodeCaster Sep 18 '15 at 16:22
  • @CodeCaster, indeed.. my comment doesn't answer the 'yes' or 'no' question being asked.. it addresses the inferred question, how can I omit duplicate entries when doing a SaveChanges method call on the DbContext.. – Brett Caswell Sep 18 '15 at 16:26
  • @Brett you should fix that before adding them to the change tracker. If you add a record in batch one, and the same record in batch two that won't help, neither if the duplicate is already in the database. – CodeCaster Sep 18 '15 at 16:29
  • I totally agree, I would not allow a duplicate to be added in the first place, if such a constraint exists.. whether it's business or system. – Brett Caswell Sep 18 '15 at 16:31