26

I have a list of entities and I want to insert them into a database. If an entity already exists in the database as is, then it needs to be skipped. If it's in the database but with different values, then it needs to be updated.

Is there any way to do this other than doing a DB call per item?

My plan is to try an insert and, if a unique constraint exception on the key is thrown, do an update instead.
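A minimal sketch of that plan, assuming EF 6 and hypothetical MyDbContext/Customer types (none of the names below come from the question):

using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Data.SqlClient;

// Try the insert first; fall back to an update only when the key already exists.
static void Upsert(MyDbContext context, Customer item)
{
    try
    {
        context.Customers.Add(item);
        context.SaveChanges();
    }
    catch (DbUpdateException ex)
    {
        var sql = ex.GetBaseException() as SqlException;
        // 2627 = unique constraint violation, 2601 = duplicate key in a unique index
        if (sql == null || (sql.Number != 2627 && sql.Number != 2601))
            throw;

        context.Entry(item).State = EntityState.Modified;  // re-issue as an UPDATE
        context.SaveChanges();
    }
}

This still costs one database call per item (two on a conflict), which is exactly the overhead I'd like to avoid.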

Ian Warburton
  • possible duplicate of [Fastest Way of Inserting in Entity Framework](http://stackoverflow.com/questions/5940225/fastest-way-of-inserting-in-entity-framework) – Chris Moschini Jan 04 '14 at 20:16
  • Here is a very good answer: http://stackoverflow.com/questions/5940225/fastest-way-of-inserting-in-entity-framework – Altaf Patel Nov 17 '14 at 16:47
  • Telerik DataAccess can do bulk updates – Evgeny Gorbovoy May 04 '18 at 16:01

4 Answers

25

Just don't use Entity Framework in this case. Use a stored procedure instead (how to do this depends on the EF version/approach you use; you might have to extend your DbContext or add a mapping from the entity model).

If you're using SQL Server, use the MERGE command in your stored procedure; it efficiently does exactly what you need: insert the row if it doesn't exist, or update it if it does. Everything happens in a single, efficient SQL statement.
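For reference, a sketch of the calling side using a table-valued parameter, as suggested in the comments (assuming a hypothetical dbo.UpsertCustomers procedure and a dbo.CustomerType table type on the server; none of these names come from the answer):

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// The hypothetical procedure body would be a single MERGE, e.g.:
//   MERGE dbo.Customers AS target
//   USING @rows AS source ON target.Id = source.Id
//   WHEN MATCHED AND target.Name <> source.Name THEN UPDATE SET Name = source.Name
//   WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (source.Id, source.Name);
static void Upsert(string connectionString, IEnumerable<Customer> items)
{
    var rows = new DataTable();
    rows.Columns.Add("Id", typeof(int));
    rows.Columns.Add("Name", typeof(string));
    foreach (var c in items)
        rows.Rows.Add(c.Id, c.Name);

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.UpsertCustomers", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        var p = command.Parameters.AddWithValue("@rows", rows);
        p.SqlDbType = SqlDbType.Structured;
        p.TypeName = "dbo.CustomerType";   // user-defined table type on the server

        connection.Open();
        command.ExecuteNonQuery();         // one round trip for the whole list
    }
}

The whole list goes over in one call, and the MERGE decides per row whether to skip, update, or insert.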

ken2k
  • It can't all be in a single query if using SqlBulkCopy. It would need to be a sp to create the temp table and then a sp to do the merge with the .net stuff in the middle. – Ian Warburton Jun 24 '13 at 11:56
  • @IanWarburton Why would you need `SqlBulkCopy`? The SQL `Merge` command does all you need. You could call a stored procedure by passing your entities as parameter (table valued parameter for example), and `Merge` will do all the job. – ken2k Jun 24 '13 at 11:59
  • I got it working but now they say it needs to run on SQL Server 2000. Any ideas? – Ian Warburton Jun 25 '13 at 15:28
  • Urrghhhhhh I hate that this is the answer. – Brandon Oct 13 '15 at 15:27
  • And now you've tightly coupled your code to a specific database implementation. – Auspex Jan 24 '20 at 16:07
11

EF isn't suited to BULK inserts. For thousands of records it's OK, but for large numbers (100k plus) it's slow.

If you are planning to use EF anyway:

  • try the AddOrUpdate method (instead of separate insert/update logic),
  • disable automatic change detection,
  • commit every 1000 records or fewer (see the batching sketch below).

For example:

// AddOrUpdate is an extension method in System.Data.Entity.Migrations.
Context.Set<TPoco>().AddOrUpdate(poco);   // insert or update based on the key
//...
Context.Configuration.AutoDetectChangesEnabled = false;   // skip change detection
//..
Context.SaveChanges();

If copying unrelated data, you can try loading those tables in parallel (doh).
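Putting those three points together, a minimal batching sketch (assuming EF 6 and hypothetical MyDbContext/Customer types; the 1000-record batch size is just the suggestion above, everything else is illustrative):

using System.Collections.Generic;
using System.Data.Entity.Migrations;   // AddOrUpdate lives in this namespace
using System.Linq;

static void UpsertInBatches(IEnumerable<Customer> pocos)
{
    const int batchSize = 1000;   // commit every 1000 records or fewer

    foreach (var batch in pocos.Select((poco, i) => new { poco, i })
                               .GroupBy(x => x.i / batchSize))
    {
        // A fresh context per batch keeps the change tracker small.
        using (var context = new MyDbContext())
        {
            context.Configuration.AutoDetectChangesEnabled = false;

            foreach (var item in batch)
                context.Set<Customer>().AddOrUpdate(item.poco);

            context.SaveChanges();   // one commit per batch
        }
    }
}

Note that AddOrUpdate still queries the database to decide between insert and update, so this reduces commit overhead rather than eliminating the per-item lookup.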

phil soady
  • I think it would be awkward if there was a unique constraint exception in the middle of a batch. – Ian Warburton Jun 24 '13 at 11:49
  • If you are likely to have many unique constraint violations, bulk commit is indeed less useful. Actually, the comment about save packets was to suggest you keep them small, max 1000, not too large. EF doesn't cope well with a full belly. Use single-record commits if the throughput is OK. You can also try smaller packets, e.g. 10, and if one fails, retry single records for the failed packet(s). Just various things I try to get better throughput. – phil soady Jun 24 '13 at 12:06
9

I have made an extension for that: https://efbulkinsert.codeplex.com/

It is really simple to use:

using(var context = new MyDbContext())
{
    context.BulkInsert(hugeCollectionOfEntities);
}
maxlego
1
  1. Create a temp table: SqlCommand(string.Format("SELECT TOP 0 * INTO {0} FROM {1}...

  2. Bulk insert data into it -- Entity Framework Extended mentioned above would need to be tweaked to support the temp table name but otherwise is on the right track -- or roll a bit of code and use SqlBulkCopy.

  3. Construct a MERGE statement.

If you mine a property list, you can make (2) and (3) generic. I can read and merge 150,000 rows in about 20 seconds.
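A rough sketch of those three steps with plain ADO.NET (assuming a dbo.Customers target table keyed on Id and a DataTable already shaped like it; all names are illustrative, not from the answer):

using System.Data;
using System.Data.SqlClient;

static void BulkUpsert(string connectionString, DataTable rows)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();

        // 1. Create an empty temp table with the target's schema.
        using (var create = new SqlCommand(
            "SELECT TOP 0 * INTO #staging FROM dbo.Customers", connection))
        {
            create.ExecuteNonQuery();
        }

        // 2. Bulk copy the data into the temp table.
        using (var bulk = new SqlBulkCopy(connection))
        {
            bulk.DestinationTableName = "#staging";
            bulk.WriteToServer(rows);
        }

        // 3. MERGE the temp table into the real table.
        using (var merge = new SqlCommand(@"
            MERGE dbo.Customers AS target
            USING #staging AS source ON target.Id = source.Id
            WHEN MATCHED AND target.Name <> source.Name
                THEN UPDATE SET Name = source.Name
            WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (source.Id, source.Name);",
            connection))
        {
            merge.ExecuteNonQuery();
        }
    }
}

The temp table lives only for the lifetime of the connection, so all three commands have to run on the same open connection.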

Jeff Dunlop