
I used the following commands to keep my queue process running forever, even after I close the server terminal, but it stops as soon as I close the terminal. How can I keep it always running in the background? As you can see, I tried several `nohup` variants, but with no luck.

1) nohup php artisan queue:work --tries=1 </dev/null >/dev/null 2>&1 &
2) nohup php artisan queue:work --tries=1 >/dev/null 2>&1 &
3) nohup php artisan queue:work --daemon > /dev/null 2>&1 &
4) nohup php artisan queue:work > /dev/null 2>&1 &
5) nohup php artisan queue:work --tries=1
6) nohup php artisan queue:listen >/dev/null 2>&1 &

Note: I do not have root access on the server; I am using a user account created from WHM. I don't know if that is the problem.

Himanshu Upadhyay
  • better to run it from a `crontab` or create a `systemd` thingy? – shellter Jul 04 '19 at 14:44
  • Possible duplicate of [In Linux, how to prevent a background process from being stopped after closing SSH client](https://stackoverflow.com/questions/285015/in-linux-how-to-prevent-a-background-process-from-being-stopped-after-closing-s) – Frak Jul 04 '19 at 14:45
  • when you're having trouble, you don't want to discard all the information that might appear on `std-out` and `std-err`. I recommend redirecting that output to tmp files and see if there is any helpful evidence. What you have seems like it should work, (Except that your line 5 is not backgrounded, (typo?)). Good luck. – shellter Jul 04 '19 at 14:52
  • @frakman1 it is not helpful as you can see I am already using nohup to solve the issue. – Himanshu Upadhyay Jul 04 '19 at 14:56
  • @shellter, I did not understand your concern for line #5. – Himanshu Upadhyay Jul 04 '19 at 14:57
  • Is it missing the `&` at the end, or is that deliberate? As is, that job will have to complete before it goes on to line 6. Good luck. – shellter Jul 04 '19 at 14:58
  • Also, let us know (by editing your Q), are you using `ssh`, `putty` (which mode), or something else. Also, yes agree that you are already using `nohup`, but there are a lot of other good solutions in that answer. Good luck. – shellter Jul 04 '19 at 14:59
  • You might also try to see if it something special about `php artisan` and run a very simple `nohup` test, maybe `nohup sleep 360 &`, then copy/paste the returned PID, restart your terminal is `ps -ef | grep $PIDyouCaptured`. Good luck. – shellter Jul 04 '19 at 15:09
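The diagnostic suggested in the last comment can be sketched like this (the sleep duration and log path are placeholders):

```shell
# Minimal nohup survival test: start a dummy long-running process,
# note its PID, then close the terminal and check from a new session.
nohup sleep 360 >/tmp/nohup-test.log 2>&1 &
echo "test PID: $!"
# ...after reconnecting, check whether it survived:
# ps -ef | grep "$PIDyouCaptured"
```

If the `sleep` survives but `php artisan queue:work` does not, the problem is specific to the worker (e.g. a crash logged to the redirected output) rather than to `nohup` itself.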

2 Answers


Go to the cPanel -> Cron Jobs page

and create a cron job with a command like this:

* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1

Don't forget to change /path-to-your-project to your actual project folder.

This will run Laravel's scheduled commands every minute.

Then add the following to the schedule method in the app/Console/Kernel.php file:

$schedule->command('queue:work --stop-when-empty')->everyMinute()->withoutOverlapping();
Erkan Özkök
  • That doesn't look very healthy. If you'd said `--once` I would understand, but the command above will start a new daemon every minute: 60 instances an hour. Think of a server that reboots once a day; by the end of the day the machine will be running 1440 instances. – Teoman Tıngır Jan 14 '20 at 10:47
  • @TeomanTıngır You're right, there is such a problem and I hadn't noticed it. But `--once` processes only one job from the queue and then exits. I'm updating the answer with the `--stop-when-empty` option to address the problem you mention. – Erkan Özkök Jan 14 '20 at 12:57
  • Honestly, starting queue processing with cron isn't very healthy. It's better to write a service and run the artisan command at boot; you just set `WantedBy=multi-user.target` and you're done. But that's the trouble with shared hosting :)) – Teoman Tıngır Jan 14 '20 at 14:33
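The systemd approach mentioned in the last comment could look roughly like this. The unit name, paths, and user are assumptions, and installing a unit requires root access, so it won't work on the shared WHM host in the question:

```ini
# /etc/systemd/system/laravel-worker.service (sketch)
[Unit]
Description=Laravel queue worker
After=network.target

[Service]
User=youruser
WorkingDirectory=/path-to-your-project
ExecStart=/usr/bin/php artisan queue:work --tries=1
Restart=always

[Install]
WantedBy=multi-user.target
```

After `systemctl daemon-reload` and `systemctl enable --now laravel-worker`, systemd starts the worker at boot and restarts it if it dies.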

I'm afraid that is not possible: when you stop the command or close your connection, the process will stop.

From the Laravel documentation:

To keep the queue:work process running permanently in the background, you should use a process monitor such as Supervisor to ensure that the queue worker does not stop running.

https://laravel.com/docs/5.8/queues#supervisor-configuration
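For reference, a minimal Supervisor program sketch along the lines of the Laravel docs. The paths, user, and process count are assumptions for your environment:

```ini
; /etc/supervisor/conf.d/laravel-worker.conf (sketch)
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /path-to-your-project/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
user=youruser
numprocs=2
redirect_stderr=true
stdout_logfile=/path-to-your-project/storage/logs/worker.log
```

Note that installing and configuring Supervisor normally requires root access, which the question says is not available on this WHM-managed host; in that case the cron-based approach in the other answer is the usual workaround.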

Robbin Benard
  • That should work, but it's a quick fix and not a good long term solution. The process could stop because of an error or when the server is rebooted etc. – Robbin Benard Jul 04 '19 at 15:08