My problem is that I have an ASP.NET application that must run as a 32-bit process, and it has to process a large number of records, making huge changes in the database. In case something fails, I have to wrap the whole process in a transaction.
I thought that calling DbContext.SaveChanges() after every 10,000 processed records (for example), and then getting rid of the references so the GC can do its job, would solve my issue, but it doesn't.
The problem is that every operation using the DB (I mean CRUD operations) slows down significantly (even 10 times slower or so).
My code looks something like this, for better understanding:
public void StartProcess()
{
    localScopeCtx = cont.Resolve<IApplicationDbContext>();
    IDbContextTransaction trans = localScopeCtx.BeginTransaction();

    int take = 10000;
    int index = 1;
    while (index <= needToProcess)
    {
        var records = GetRecords(take);
        List<obj> recordsToDelete;
        ProcessRecords(records, out recordsToDelete);

        localScopeCtx.Records.RemoveRange(recordsToDelete);
        localScopeCtx.SaveChanges();      // flush this batch, still inside the open transaction
        DetachAllEntities(localScopeCtx); // drop the tracked entities so the GC can collect them

        index += take;                    // advance to the next batch
    }
    trans.Commit();
}
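To make the rollback requirement from the beginning concrete: the loop above is meant to be wrapped roughly like this (a simplified sketch, not the exact code; only the commit/rollback pattern matters here):

// Sketch: a failure in any batch should roll back every batch that was
// already flushed with SaveChanges(), because they all share one transaction.
IDbContextTransaction trans = localScopeCtx.BeginTransaction();
try
{
    // ... the batching while-loop from above goes here ...
    trans.Commit();   // make all batches permanent only at the very end
}
catch
{
    trans.Rollback(); // undo every batch, including the ones already saved
    throw;
}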
public void DetachAllEntities(IApplicationDbContext dbContext)
{
    var changedEntriesCopy = dbContext.ChangeTracker.Entries()
        .Where(e => e.State == EntityState.Added ||
                    e.State == EntityState.Modified ||
                    e.State == EntityState.Deleted ||
                    e.State == EntityState.Unchanged)
        .ToList();

    foreach (var entry in changedEntriesCopy)
        entry.State = EntityState.Detached;
}
In the first iteration of the loop everything is fine: it is fast and does its job like a charm. But after the first localScopeCtx.SaveChanges(), everything slows down as I said. I think that is because EF doesn't actually send the modifications to the database but keeps them in memory (judging by the memory usage), in a clever enough way that it never runs out of it (maybe it spills into virtual memory or so).
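To illustrate what I mean by "judging by the memory usage", here is a diagnostic sketch (not part of the processing code) that could log the memory once per batch, right after DetachAllEntities in the loop, using only standard .NET APIs:

// Diagnostic sketch: log managed and process memory once per batch to see
// whether something keeps accumulating even though the entities are detached.
long managedBytes = GC.GetTotalMemory(forceFullCollection: false);
long privateBytes = System.Diagnostics.Process.GetCurrentProcess().PrivateMemorySize64;
System.Diagnostics.Debug.WriteLine(
    $"After batch starting at index {index}: managed = {managedBytes / (1024 * 1024)} MB, private = {privateBytes / (1024 * 1024)} MB");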
Any suggestions as to what the solution could be?
EDIT:
Added the DetachAllEntities method to the process.