I have some logic that creates a batch of records based on start and end dates. As these entities are created, they get added to the collection of their parent entity. Needless to say, the total can easily explode from a few entities to a few hundred, and easily into the thousands.
My initial logic just spun through the loop, creating the entities and adding them to their parents' collections, so the save was a single call, i.e.:
_context.Parent.Add(Parent);
_context.SaveChanges();
In my testing, the Parent entity could have around 1,400 entities across all of its collections (parent, child, grandchild, and great-grandchild entities). The save in EF works as expected.
The problem happens when Audit.NET attempts to write the JSON for the audit event: AuditEvent.ToJson() throws an OutOfMemoryException. The offending line is entity.AuditData = ev.ToJson();, and it takes about 10 minutes for the exception to bubble up to the point where the error is caught. By then, persistence has already happened in the database for all the entities that needed to be persisted.
Audit.Core.Configuration.Setup()
    .UseEntityFramework(ef =>
    {
        ef.UseDbContext<LogDbContext>();
        ef.AuditTypeMapper(t => typeof(EntityAuditLog))
          .AuditEntityAction<EntityAuditLog>(
              (ev, entry, entity) =>
              {
                  entity.AuditData = ev.ToJson();
                  entity.EntityType = entry.EntityType?.Name;
                  entity.AuditDate = DateTimeOffset.UtcNow;
                  entity.AuditAction = entry.Action;
              })
          .IgnoreMatchedProperties(true);
    });
What I did to get around this was break the loop logic and the _context.SaveChanges() call into smaller chunks.
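Roughly, the chunking looks like this (a simplified sketch, not my real code; chunkSize, EachDay, and CreateChild are illustrative placeholders):

```csharp
// Simplified sketch of chunked saving. Each SaveChanges() call means
// Audit.NET only has to serialize ~chunkSize entries at a time.
const int chunkSize = 200; // illustrative batch size
int pending = 0;

foreach (var date in EachDay(startDate, endDate)) // hypothetical date iterator
{
    parent.Children.Add(CreateChild(date));       // hypothetical factory

    if (++pending == chunkSize)
    {
        _context.SaveChanges(); // flush this chunk
        pending = 0;
    }
}

if (pending > 0)
{
    _context.SaveChanges(); // flush the remainder
}
```

This keeps the audit events small enough to serialize, but it splits what was logically one save into many, which is why I'd prefer a configuration-level fix.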
I am curious whether there are Audit.NET configuration options I missed, or a strategy I should be aware of when building entities with large collections while using Audit.NET?