
We use Laravel and rely heavily on webhooks.

We receive about 10 requests/second via webhooks, and some services send multiple identical requests at the same time. We need to save only one entry per unique request, identified by (service_id, param).

We are getting duplicate entries with the following logic:

$model = MyModel::where('service_id', $request->service_id)
    ->where('param', $request->param)
    ->first();

if ($model) {
    // update the existing model
} else {
    // create a new model
}

When I check service_id and param in MyModel, there are a lot of duplicate entries. I think that while one request is creating a record, another identical request can run the query, find nothing, and create a new one.

Can anyone help me solve this issue? I think we could use a queue to handle the requests synchronously, but a queue is not an option for us right now.

LoveCoding
  • Use try/catch. Then you can ignore the error because the record is already in the database. Or re-fetch the existing record and then do something else with it – ljubadr Dec 14 '22 at 22:20
  • Maybe [this answer](https://stackoverflow.com/a/27879329/3226121) can help – ljubadr Dec 14 '22 at 22:23
  • Forgot to add - for this to work you need to have unique constraints applied for your database... – ljubadr Dec 14 '22 at 22:39
  • Unique is the easiest way to do this as it will stop the duplications, and you can soft handle the rest – MysticSeagull Dec 14 '22 at 23:21
  • If we integrate Queue for this logic, will it solve problem? – LoveCoding Dec 15 '22 at 13:01
  • Queue usually runs with multiple workers so it's possible that 2 different workers will run at the same time. I would suggest to create unique key in database to deal with this. Or you could try to use [Atomic Locks](https://laravel.com/docs/9.x/cache#atomic-locks) – ljubadr Dec 15 '22 at 17:01
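The approach suggested in the comments above can be sketched as follows. This is a minimal, hedged example assuming a `my_models` table backing `MyModel`; the migration and column names are taken from the question, everything else is illustrative.

```php
<?php

// Migration: add a composite unique index so the database itself
// rejects a second row with the same (service_id, param) pair.
Schema::table('my_models', function (Blueprint $table) {
    $table->unique(['service_id', 'param']);
});

// Webhook handler: attempt the insert, and fall back to an update
// when a concurrent identical request already created the row.
use Illuminate\Database\QueryException;

try {
    MyModel::create([
        'service_id' => $request->service_id,
        'param'      => $request->param,
        // ...other fields
    ]);
} catch (QueryException $e) {
    // Another request won the race; the unique index rejected our insert.
    // Re-fetch the existing record and update it instead.
    $model = MyModel::where('service_id', $request->service_id)
        ->where('param', $request->param)
        ->first();
    // update $model as needed
}
```

With the unique index in place, the race in the original check-then-create logic can no longer produce duplicates; at worst the losing request gets a `QueryException` it can handle gracefully.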

1 Answer


If you followed Laravel's Supervisor configuration documentation, you might be running 8 worker daemons, which means up to 8 jobs can execute simultaneously. Even with a different setup, multiple jobs may still run at the same time.

We can solve this by using job middleware in Laravel.

// App\Jobs\MyJob.php

use Illuminate\Queue\Middleware\WithoutOverlapping;

...

public function middleware()
{
    return [new WithoutOverlapping($this->what_ever_id)];
}

This prevents jobs with the same what_ever_id value from running simultaneously.
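For completeness, here is a hedged sketch of what the full job class and its dispatch from the webhook controller might look like. `MyJob` and `what_ever_id` come from the snippet above; the constructor, the `handle` body, and the use of `service_id` as the deduplication key are illustrative assumptions.

```php
<?php

// App\Jobs\MyJob.php (fuller sketch)
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\Middleware\WithoutOverlapping;

class MyJob implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(public $what_ever_id)
    {
    }

    public function middleware()
    {
        // Jobs sharing the same key are serialized instead of overlapping.
        return [new WithoutOverlapping($this->what_ever_id)];
    }

    public function handle()
    {
        // perform the webhook work here
    }
}

// Dispatching from the webhook controller, keyed on the request's service_id:
MyJob::dispatch($request->service_id);
```

Note that `WithoutOverlapping` relies on the cache to take a lock per key, so the cache driver must be shared by all workers for this to hold across daemons.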

This solved my issue.

LoveCoding