I have two different tasks defined and I would like to route each task to its corresponding queue. However, when I specify a queue with the -Q
option, the tasks are not consumed by Celery; without the -Q
option everything works as expected. I am using celery==4.2.0
Am I missing some configuration, or am I setting something up the wrong way?
I'm using docker-compose.yml
to run each queue in a different container.
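My worker/tasks.py looks roughly like the sketch below; the task bodies are trimmed to placeholders, and the explicit name= arguments are there so the names line up with the beat schedule in celery.py further down:
from worker.celery import app

@app.task(name="worker.get_message_list")
def get_message_list():
    # Placeholder body; the real task builds and returns the message list.
    ...

@app.task(name="worker.get_something_list")
def get_something_list():
    # Placeholder body.
    ...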
docker-compose.yml <-- does not work since -Q is specified
version: '3'
services:
  broker:
    image: rabbitmq:3
    environment: &envfile
      RABBITMQ_DEFAULT_USER: user
      RABBITMQ_DEFAULT_PASS: password
      CELERY_BROKER: {something}
      RABBITMQ_BACKEND: {something}
    ports:
      - 5672:5672
  worker1:
    build: .
    image: worker1
    restart: "always"
    environment: *envfile
    command: ["celery", "worker", "--app=worker.app", "--concurrency=4", "--hostname=worker1@%h", "--loglevel=INFO", "-Q", "queue1"]
    depends_on:
      - broker
  worker2:
    build: .
    image: worker2
    restart: "always"
    environment: *envfile
    command: ["celery", "worker", "--app=worker.app", "--concurrency=4", "--hostname=worker2@%h", "--loglevel=INFO", "-Q", "queue2"]
    depends_on:
      - broker
  scheduler:
    build: .
    image: scheduler
    restart: "always"
    environment: *envfile
    command: ["celery", "beat", "--app=worker.app", "--loglevel=INFO"]
    depends_on:
      - broker
      - worker1
      - worker2
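To check which queues each worker is actually consuming from, I inspect the running workers from a shell inside one of the containers; this uses Celery's standard remote-control API:
from worker.celery import app

# Ask all running workers which queues they consume from; returns a
# dict mapping each worker hostname to its list of queue declarations.
print(app.control.inspect().active_queues())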
My file structure is as follows:
rootfolder
└── worker
    ├── celery.py
    ├── tasks.py
    ├── schedule.py
    └── __init__.py
celery.py
from celery import Celery
import os

app = Celery(
    'worker',
    broker=os.environ['CELERY_BROKER'],
    backend=os.environ['RABBITMQ_BACKEND']
)

app.conf.beat_schedule = {
    "get-message": {
        "task": "worker.get_message_list",
        "schedule": 60
    },
    "get-somethingelse": {
        "task": "worker.get_something_list",
        "schedule": 60
    }
}

if __name__ == '__main__':
    app.start()
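From the docs, Celery 4.x publishes tasks to the default celery queue unless task_routes maps them elsewhere, so I suspect a mapping like the sketch below may be relevant (queue names taken from my compose file):
# Hypothetical addition to celery.py; not part of my current config:
app.conf.task_routes = {
    "worker.get_message_list": {"queue": "queue1"},
    "worker.get_something_list": {"queue": "queue2"},
}
Is something like this required, or should the -Q option alone be enough?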