1

I'm doing a lot of work with Entity Framework, like millions of inserts and updates.
However, over time it gets slower and slower...

I tried using some ways to improve performance, like:

db.Configuration.AutoDetectChangesEnabled = false;
db.Configuration.ValidateOnSaveEnabled = false;

I also tried:

db.Table.AsNoTracking();

When I change all these things it really does get faster. However, memory usage starts to increase until it gives me an exception.

Has anyone had this situation? Thanks

Pacheco
  • 950
  • 11
  • 18
  • 3
    Are you keeping a single instance of your `DbContext` derived type alive over lots of operations? – Richard Aug 13 '15 at 14:56
  • Yes, one instance. I also tried reloading the DbContext and disposing it. – Pacheco Aug 13 '15 at 15:00
  • 2
    You should generally be using a single context per "operation." Such as fetching user details, facilitating a logon, etc. Is there a reason you keep a single context? – Bigsby Aug 13 '15 at 15:12
  • I tried to dispose it and then re-create it, but it still gives me out of memory. – Pacheco Aug 27 '15 at 12:19

4 Answers

2

The DbContext stores all the entities you have fetched or added to a DbSet. As others have suggested, you need to dispose of the context after each group of operations (a set of closely-related operations - e.g. a web request) and create a new one.

In the case of inserting millions of entities, that might mean creating a new context every 1,000 entities for example. This answer gives you all you need to know about inserting thousands of entities.
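A minimal sketch of that pattern, assuming a `MyDbContext` with an `Items` DbSet (the names are illustrative, not from the question):

```csharp
// Batched insert: dispose and recreate the context every 1,000 entities
// so the change tracker never holds millions of objects at once.
MyDbContext context = null;
try
{
    context = new MyDbContext();
    context.Configuration.AutoDetectChangesEnabled = false;

    int count = 0;
    foreach (var entity in entitiesToInsert)
    {
        context.Items.Add(entity);

        if (++count % 1000 == 0)
        {
            context.SaveChanges();
            context.Dispose();              // release all tracked entities
            context = new MyDbContext();    // start with an empty change tracker
            context.Configuration.AutoDetectChangesEnabled = false;
        }
    }
    context.SaveChanges(); // flush the final partial batch
}
finally
{
    if (context != null) context.Dispose();
}
```

Disabling `AutoDetectChangesEnabled` again after each recreation matters, because the flag belongs to the context instance and resets to `true` on a fresh one.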

Tim Rogers
  • 21,297
  • 6
  • 52
  • 68
  • Thanks for the answer. I tried to use Dispose() and then create a new one as you suggest, but it still gives the exception. – Pacheco Aug 27 '15 at 12:23
0

If you are doing only inserts and updates, try to use db.Database.ExecuteSqlCommand(sql, parameters) instead. (SqlQuery is for reading; ExecuteSqlCommand runs raw SQL without attaching anything to the context.)

Entity Framework keeps all attached objects in memory, so tracking millions of them will eventually exhaust available memory.
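For example (a sketch; the context type, table, and columns are made up for illustration):

```csharp
using (var db = new MyDbContext())
{
    // ExecuteSqlCommand sends raw SQL to the database and attaches
    // nothing to the change tracker, so memory usage stays flat.
    db.Database.ExecuteSqlCommand(
        "INSERT INTO Users (Name, Email) VALUES (@p0, @p1)",
        "Alice", "alice@example.com");
}
```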

Alex Lebedev
  • 601
  • 5
  • 14
0

https://github.com/loresoft/EntityFramework.Extended offers a clean interface for doing faster bulk updates and deletes. I think it only works with SQL Server, but it may give you a quick solution to your performance issue.

Deletes can be done like this:

context.Users.Where(u => u.FirstName == "Firstname").Delete();

Updates can be done in a similar fashion:

context.Tasks.Where(t => t.StatusId == 1).Update(t => new Task { StatusId = 2 });
Nick Strupat
  • 4,928
  • 4
  • 44
  • 56
0

For millions of inserts and updates, everything gave out of memory; I tried it all.
It only worked for me when I stopped using the context and used ADO.NET or another micro-ORM like Dapper.
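For example, a bulk insert with Dapper might look like this (a sketch; the connection string, table, and item type are placeholders):

```csharp
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // Dapper runs the statement once per element of the collection,
    // with no change tracking, so memory usage stays constant.
    connection.Execute(
        "INSERT INTO Items (Name, Value) VALUES (@Name, @Value)",
        itemsToInsert);
}
```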

Pacheco
  • 950
  • 11
  • 18