
I have a very simple application using Entity Framework, but after it has run for a week, the memory usage is awful (only 80 MB at first, 700 MB after one week). When I profiled the application with dotMemory, I found that the memory in Heap generation 2 keeps increasing the whole time.

[dotMemory screenshot: Heap generation 2 growth after only 40 minutes of running]

I took a snapshot and found that the retained bytes of the EF DbContext were the largest.


I am so confused. My application is so simple. Code sample:

protected CarbonBrushMonitorEntities _entities = new CarbonBrushMonitorEntities();
public void Add(HistoryData data)
{
   _entities.HistoryDatas.Add(data);
   _entities.SaveChanges();
}  

_entities is initialized only once, at startup, and is then used for the lifetime of the application.

The Add function is called frequently, about 3 times per second.

I googled for a long time and tried some methods, such as:

_entities.Configuration.ValidateOnSaveEnabled = false;
_entities.Configuration.AutoDetectChangesEnabled = false;
_entities.Configuration.LazyLoadingEnabled = false;

but these do not work.

O. Jones
yubaolee

3 Answers


If you use Entity Framework, you should create the context just before you need it and dispose of it as soon as possible:

 using (var someContext = new SomeContext())
 {
    // your commands/queries
 }

Never keep context in memory or share it across different calls.
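Applied to the code in the question, a minimal sketch could look like this (assuming the question's context type and its `HistoryDatas` set; note that EF caches the model metadata per AppDomain, so creating a context per call is cheap even at a few calls per second):

```csharp
public class HistoryDataRepository
{
    // A short-lived context per operation: the using block disposes it,
    // so the change tracker never accumulates entries across calls.
    public void Add(HistoryData data)
    {
        using (var entities = new CarbonBrushMonitorEntities())
        {
            entities.HistoryDatas.Add(data);
            entities.SaveChanges();
        }
    }
}
```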

What I typically do is register the context with an IoC container:

 DependencyFactory.RegisterType(typeof(SomeContext));

and use a context resolver (also registered with IoC of course) like:

 using (var someContext = _contextResolver.ResolveContext())
 {
     // your commands/queries
 }    

where resolution is done like:

 public class ContextResolver : IContextResolver
 {
     public ISomeContext ResolveContext()
     {
          return DependencyFactory.Resolve<SomeContext>();
     }
 }

The EF context is actually your unit of work, which should be disposed of once you don't need it anymore.

L-Four

The other way is to clear the change tracker for the entities of concern, or even for all entities. This is done by changing the entity state to Detached, after calling dbContext.SaveChangesAsync().

protected void DisposeDbset<T>() where T : class
{
    // Detach every tracked entry of type T so it can be garbage collected
    var trackedEntries = _unitOfWork.dbContext.ChangeTracker.Entries<T>();
    foreach (var item in trackedEntries.ToList())
    {
        item.State = EntityState.Detached;
    }
    GC.Collect(); // optional; the GC would reclaim the detached entries on its own
}

I recently faced a similar situation where I was inserting 300,000 rows in a batch operation. After inserting the rows, the change-tracking info for all of them remained in memory with the entity state Unchanged, so the change tracker accumulated entries after every SaveChangesAsync() call.

I could not afford to resolve a new DbContext instance for every batch, as that was a more expensive operation.

Just FYI, I had configured dbContext.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking, but that applies only when fetching data.

Hopefully this is helpful. I found my solution with the help of this link: http://andreyzavadskiy.com/2016/09/23/entries-in-entity-framework-changetracker-could-degrade-database-write-performance/
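As a sketch, the detach helper above can be combined with chunked saves in a batch insert (the `DisposeDbset<T>` helper is the one from this answer; the batch size and the `rows` collection are illustrative assumptions):

```csharp
// Hypothetical batch loop: save in chunks, then detach what was just
// tracked so the change tracker does not grow with every SaveChangesAsync.
const int batchSize = 1000;
for (int i = 0; i < rows.Count; i += batchSize)
{
    foreach (var row in rows.Skip(i).Take(batchSize))
    {
        _unitOfWork.dbContext.Set<HistoryData>().Add(row);
    }
    await _unitOfWork.dbContext.SaveChangesAsync();
    DisposeDbset<HistoryData>(); // detach the rows saved in this batch
}
```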

Chintan shah

Based on Chintan shah's answer, I made an extension method and an example.

public static class DbContextExtensions
{
    /// <summary>
    /// Set all entries in ChangeTracker to detached to get them collected by the GC
    /// </summary>
    /// <param name="context"></param>
    public static void DetachAllEntriesInChangeTracker(this DbContext context)
    {
        try
        {
            foreach (var entityEntry in context.ChangeTracker.Entries())
            {
                entityEntry.State = EntityState.Detached;
            }
        }
        catch (Exception e)
        {
            LogManager.GetLogger(context.GetType().FullName).Error(e, "error when detaching all entries in changeTracker");
        }
    }
}
public class FooDbContext : DbContext
{
    public override void Dispose()
    {
        this.DetachAllEntriesInChangeTracker();
        base.Dispose();
    }
}
Dominic Jonas
  • Bad solution. If you dispose a context the problem is already gone, see the accepted answer. This `Dispose` override only takes more time. The point of this Q&A is that people should adhere to a healthy context life cycle. When these detach tricks are necessary that usually spells trouble in the life cycle area. – Gert Arnold Sep 16 '20 at 10:25
  • 3
    In my project I'm actually using a `DbContextScope` library and also `Lamar` and `AspNet` for `dependencyInjection`. But I still have that memory leak, that the `DbContext` is not removed from memory after disposing. After examining it with [dotMemory](https://www.jetbrains.com/dotmemory/), I was able to successfully solve it by using `DetachAllEntriesInChangeTracker()` in my `Dispose()`. In principle, however, I am of your opinion that this should not be necessary. – Dominic Jonas Sep 16 '20 at 13:35
  • That makes this solution highly specific. It's certainly not widely applicable and IMO not fit for a Stack Overflow answer, because it may encourage bad practice. Even in your case you'd do better digging deeper to get to the bottom of this memory leak. Maybe you have long-lived proxy objects that keep references to the contexts they were created with. – Gert Arnold Sep 16 '20 at 14:28
  • I am on the latest update of EF for .NET Core 3.1, running several test background services in ASP.NET doing read/update/delete/dispose. Dispose alone did not help, for sure, so I killed the change-tracking entries directly. Not sure if AsNoTracking alone helps either. – Dzmitry Lahoda Dec 16 '20 at 04:46