
I have a Laravel queue job that takes around 10 to 12 seconds to complete. It does not fail, and no errors are logged.

The only problem is that the worker just stops processing the next jobs. I have to run `queue:work` again to continue processing them.


I have tried appending the code below, using the highest values possible (the max I tried was 10,000), but none of it worked.

public $timeout = ...;
public $tries = ...;

public function retryUntil()
{
    return now()->addSeconds(...);
}
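For reference, this is roughly how those properties sit in a full job class. Note that the property Laravel recognizes is `$tries`, not `$retry`; the class name and values below are purely illustrative:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

// Hypothetical job class for illustration only.
class ProcessSlowTask implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    // Seconds the job may run before the worker kills it.
    public $timeout = 120;

    // Number of times the job may be attempted before it is marked failed.
    public $tries = 3;

    public function handle()
    {
        // The ~10-12 second workload goes here.
    }
}
```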

I have also tried worker options like --memory, --timeout, --tries, etc. None of them worked.
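For comparison, a worker invocation with those options spelled out might look like this (the values are illustrative; the option name is `--tries`, not `retry`):

```shell
# Illustrative values only; requires a Laravel application to run.
php artisan queue:work --timeout=120 --tries=3 --memory=512 --sleep=3
```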


Our php.ini configuration already seems high enough:

memory_limit = 1024M
post_max_size = 16M
upload_max_filesize = 16M
max_input_vars = 3000
max_execution_time = 600
  • Keep in mind the queue will more likely use the CLI's `php.ini` (possibly with no execution limit at all), not the web server's `php.ini`. The queue is meant to work like a queue - one task finishes after another. It must be continuously restarted to process the next entry. Maybe you're asking how to put this into a cronjob? – Peter Krebs Mar 08 '23 at 16:22
  • Does this answer your question? [Laravel queue worker with cron](https://stackoverflow.com/questions/46487907/laravel-queue-worker-with-cron) – Peter Krebs Mar 08 '23 at 16:22
  • @PeterKrebs no, I'm not using a cronjob. The suggested link does not answer the question either; I had already tried that before, and it didn't work. – tempra Mar 08 '23 at 16:56

1 Answer


The `queue:work` command executes only what is already in the queue and then finishes. To keep picking up new jobs you need to use `queue:listen`, but this locks your terminal and needs it to stay open. I recommend using Supervisor to keep the worker running automatically; see the Laravel queue documentation.
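A Supervisor program entry for a Laravel worker typically looks like the sketch below, adapted from the pattern shown in the Laravel docs. The paths, user, and process count are placeholders you would adjust for your own server:

```ini
; /etc/supervisor/conf.d/laravel-worker.conf -- illustrative paths and values
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/your-app/artisan queue:work --sleep=3 --tries=3 --timeout=120
autostart=true
autorestart=true
user=www-data
numprocs=2
redirect_stderr=true
stdout_logfile=/var/www/your-app/storage/logs/worker.log
stopwaitsecs=3600
```

After adding the file, reload Supervisor (`supervisorctl reread`, then `supervisorctl update`, then `supervisorctl start "laravel-worker:*"`) so the workers start and are restarted automatically if they die.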