I have a table that contains more than half a million records. Each record has about 60 fields, but we only change three of them.
We make a small modification to each entity based on a calculation and a lookup.
Clearly I can't update each entity in turn and call SaveChanges after each one, as that would take far too long, so I call SaveChanges on the context once at the end of the whole process.
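For comparison, the per-entity version I ruled out would be roughly this, with a SaveChanges call on every iteration:

DataRepository<ExportOrderSKUData> repoExportOrders = new DataRepository<ExportOrderSKUData>();

foreach (ExportOrderSKUData grpDCItem in repoExportOrders.All())
{
    // ..make changes to the entity..
    repoExportOrders.SaveChanges(); // one database round trip per record
}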
However, that single SaveChanges at the end is throwing an out-of-memory error.
I'm using the DataRepository pattern.
//Update code
DataRepository<ExportOrderSKUData> repoExportOrders = new DataRepository<ExportOrderSKUData>();

foreach (ExportOrderSKUData grpDCItem in repoExportOrders.All())
{
    // ..make changes to the entity..
}

repoExportOrders.SaveChanges();
//Data repository snip
public DataRepository()
{
    _context = new tomEntities();
    _objectSet = _context.CreateObjectSet<T>();
}

public List<T> All()
{
    // Materialises the entire set into memory in one go
    return _objectSet.ToList<T>();
}

public void SaveChanges()
{
    _context.SaveChanges();
}
What should I be looking for in this instance?