I have a huge model with a lot of attributes, including multiple ManyToMany mappings. Most additions/updates in the app happen via the REST API, but for minor corrections I use the Django admin form. This admin form also has multiple inline formsets.
I want to publish an event to Kafka (`publish_event`) after the model is updated, whether through the form or the REST API. I also want this to happen only once the transaction is committed to the DB, so that services listening to the Kafka events don't end up fetching stale data from the DB.
I referred to this SO post, but it appears to register the hook on every transaction rather than on a per-model basis, and using `on_commit` poses the problem of things getting called twice (more on that below).
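By `on_commit` I mean Django's `transaction.on_commit` hook. A rough sketch of how it would be registered, just to show the idea; `update_instance` is only an illustrative helper, and the import path for `publish_event` (my existing Kafka publisher) is a placeholder:

```python
from django.db import transaction

from myapp.events import publish_event  # placeholder path; my existing Kafka publisher


def update_instance(instance):
    # Illustrative helper, not my real code path.
    with transaction.atomic():
        instance.save()
        # on_commit runs the callback only after the outermost atomic block
        # commits, and drops it on rollback. But if this point is reached
        # twice in the same transaction, the event is also published twice.
        transaction.on_commit(publish_event)
```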
Things I've tried so far:
- Signals: Rejected, because adding the ManyToMany mappings means `model.save()` needs to be called twice, which ended up publishing two events. A signal also fires on model save, not on transaction commit, so in case of a rollback I would still end up publishing an event.
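  Roughly what the signal version looked like (a sketch; `MyModel` stands in for my actual model and the import paths are placeholders):

```python
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.models import MyModel        # placeholder for my actual model
from myapp.events import publish_event  # placeholder path; my Kafka publisher


@receiver(post_save, sender=MyModel)
def publish_on_save(sender, instance, created, **kwargs):
    # Fires on every save() call, so two saves mean two events,
    # and it fires even if the surrounding transaction later rolls back.
    publish_event()
```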
- Overriding the model's `save(self, *args, **kwargs)` method: Rejected for the same reason, since `model.save()` is called twice.
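  I.e. something along these lines (again, `MyModel` is a placeholder):

```python
from django.db import models

from myapp.events import publish_event  # placeholder path; my Kafka publisher


class MyModel(models.Model):
    # ... fields ...

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        # save() runs twice when the M2M mappings are added, so this
        # publishes two events per update, and it still runs even if
        # the transaction is later rolled back.
        publish_event()
```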
- Overriding `ModelAdmin`'s `save_model`: This is one of the first things called when we hit Save on the form, so overriding it doesn't help, because the formsets have not been processed yet and the full state, including the M2M mappings, is not yet in the DB.

```python
def save_model(self, request, obj, form, change):
    super().save_model(request, obj, form, change)
    # Too early: inline formsets and M2M rows have not been saved yet.
    publish_event()
```
- Overriding `ModelAdmin`'s `save_related`: This seemed to be the solution at first, but again the transaction is not yet committed to the DB at this point.

```python
def save_related(self, request, form, formsets, change):
    form.save_m2m()
    for formset in formsets:
        self.save_formset(request, form, formset, change=change)
    # Related objects are saved now, but the outer transaction has not
    # committed yet, so listeners can still read stale data.
    publish_event()
```
So far I haven't been able to figure out any callback that is triggered after the transaction is committed.