I have the following setup:
- MySQL + PHP
- Standard Symfony 5 setup
- Standard Doctrine ORM usage
- High traffic
With the rise in traffic I keep running into the following issue more and more often (it used to be about once per month, now it's more than 50 times per day):
Two requests need to insert the same unique value into the same table (let's say the same PK, to keep the sample code simple). My code looks like this:
$existingEntity = $tableRepository->find($id);
if (!$existingEntity) {
    $newEntity = new EntityName();
    $newEntity->setId($id);
    $em->persist($newEntity);
    $em->flush();
}
The second request fails with "The EntityManager is closed" (because MySQL returns a duplicate key error). So basically both requests ran the check at the same time, and both created the entity and tried to persist it.
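As far as I can tell, catching the exception doesn't really help either, because Doctrine closes the EntityManager once the flush has failed. A rough sketch of what I mean (same placeholder entity/repository names as above):

```php
use Doctrine\DBAL\Exception\UniqueConstraintViolationException;

try {
    $existingEntity = $tableRepository->find($id);
    if (!$existingEntity) {
        $newEntity = new EntityName();
        $newEntity->setId($id);
        $em->persist($newEntity);
        $em->flush();
    }
} catch (UniqueConstraintViolationException $e) {
    // The duplicate insert ends up here, but by now Doctrine has closed
    // the EntityManager, so any further persist()/flush() in the same
    // request fails with "The EntityManager is closed".
}
```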
Now obviously, the slower the network and the more complicated the code leading up to that particular part of the script, the higher the chance of this happening. But I've optimized everything I could think of and moved the check to the very beginning of the script.
I was thinking of building some kind of lock table in Redis or something, but that would basically just move the problem from one place to another. How can I approach this issue?
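For reference, this is roughly the lock idea I was considering (just a sketch using the symfony/lock component with a Redis store; the `$redis` client wiring and the lock key name are made up for the example):

```php
use Symfony\Component\Lock\LockFactory;
use Symfony\Component\Lock\Store\RedisStore;

// Sketch only: $redis is assumed to be an already-connected \Redis (or Predis)
// client, and the lock key name is invented for this example.
$store = new RedisStore($redis);
$lockFactory = new LockFactory($store);

$lock = $lockFactory->createLock('entity_name_'.$id);
if ($lock->acquire()) { // non-blocking: the second request simply doesn't get the lock
    try {
        $existingEntity = $tableRepository->find($id);
        if (!$existingEntity) {
            $newEntity = new EntityName();
            $newEntity->setId($id);
            $em->persist($newEntity);
            $em->flush();
        }
    } finally {
        $lock->release();
    }
}
```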
Edit:
Example without a PK:
$existingEntity = $tableRepository->findOneBy(['columnWithUniqueConstraint' => $value]);
if (!$existingEntity) {
    $newEntity = new EntityName();
    $newEntity->setColumnWithUniqueConstraint($value);
    $em->persist($newEntity);
    $em->flush();
}