I am trying to do large inserts using Entity Framework, so to reduce context bloat I am recreating the context every x loops. However, after recreating the context, the data from the previous context is still present in the new context's local data. I have tried calling Dispose() on the context and setting it to null before recreating it.
I found a previously asked question where someone seems to have the same issue, but no solution was found: Entity Framework Context - Recreating Object keeps old object in memory
ApplicationDbContext context = new ApplicationDbContext();
context.Configuration.AutoDetectChangesEnabled = false;
for (int i = 0; i < products.Count; i++)
{
    context.Product.Add(products[i]);
    if (i % 50 == 0)
    {
        context.SaveChanges();
        context.Dispose();
        context = null;
        context = new ApplicationDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;
    }
}
context.SaveChanges();
For clarification, this is what I mean when I say the local data is not cleared: the local data for Products has 151 entries, despite the context being recreated every 50 entries. This has led to products being added to the database multiple times.
Just to clarify what I am asking: how can I recreate the context so that it does not have any data left over from the previous context? Also, if anyone could explain what causes the data to be retained in the first place, that would be nice to understand.
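The 151 figure comes from inspecting the local data right after a save inside the loop; the inspection itself is only illustrative, something along these lines:

// Illustrative only - read right after SaveChanges() inside the if block.
// With a genuinely fresh context per batch, this would stay at 50 or below.
System.Diagnostics.Debug.WriteLine("Local products: " + context.Product.Local.Count);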
Edit: I have tried restructuring the code as suggested by Gert Arnold in the comments. Here is the current version.
public void BatchAddProducts(List<ProductModel> products)
{
    // Cast before dividing so the division is not done in integer arithmetic.
    int loops = (int)Math.Ceiling((double)products.Count / 50);
    List<ProductModel> sublist = new List<ProductModel>();
    for (int i = 0; i < loops; i++)
    {
        int toPull;
        if ((i * 50 + 50) > products.Count)
        {
            toPull = products.Count - (i * 50);
        }
        else
        {
            toPull = 50;
        }
        sublist = products.GetRange(i * 50, toPull);
        ProductAdd(sublist);
    }
}

private void ProductAdd(List<ProductModel> products)
{
    using (ApplicationDbContext context = new ApplicationDbContext())
    {
        context.Product.Local.Clear();
        context.Product.AddRange(products);
        context.SaveChanges();
    }
}
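For scale, the method gets called once with the full list. The call site below is only a placeholder sketch (the names are illustrative, not the real code), but the list it receives holds roughly 8,000 products, i.e. about 160 batches of 50:

// Placeholder call site - names are illustrative, not the actual code.
// The real list (~8,000 ProductModel items) is built by older code that
// worked fine with the previous update system.
List<ProductModel> allProducts = BuildProductList();   // hypothetical helper
BatchAddProducts(allProducts);                          // roughly 160 batches of 50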
This still has the same issue. The context is retaining information from the instance that should no longer exist, and entries that were added in a previous loop are being added to the database again, i.e. after 2 iterations 150 entries have been added to the database, and after 3 there are 300 new entries in the database.
In response to the answer by Steve Py: I am checking the context.Product.Local information just before SaveChanges() is called. The duplicate data I mention consists of duplicates of the products that are in the context. So, to refer to the example I made earlier: after 3 loops there are 3 versions of a product from the 1st iteration, 2 from the 2nd, and 1 from the 3rd.
I don't believe it is an issue with how the product list is being generated. That was my first thought when I encountered this, and I checked that code. This code is being written as an upgrade to improve large inserts, and the code that generates the products is older code that worked fine with the old update system. Also, the list being passed to the function only has ~8000 products, so it can't be that as far as I can tell.
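Concretely, the inspection point sits inside ProductAdd, between AddRange and SaveChanges; the debug line here is only illustrative of what is being looked at:

context.Product.AddRange(products);

// Illustrative inspection point - this is where the retained entries show up.
// After 3 batches, Local holds three versions of the products from the 1st batch,
// two from the 2nd, and one from the 3rd.
System.Diagnostics.Debug.WriteLine("Local count before save: " + context.Product.Local.Count);

context.SaveChanges();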