8

I have a Django application in which I want to configure Celery to run background tasks.

Packages:

  1. celery==4.2.1

  2. Django==2.1.3

  3. Python==3.5

  4. Redis-server==3.0.6

The Celery configuration in the settings.py file is:

from celery.schedules import crontab

CELERY_BROKER_URL = 'redis://localhost:6379'

CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_BEAT_SCHEDULE = {
    'task-number-one': {
            'task': 'app.tasks.task_number_one',
            'schedule': crontab(minute='*/1'),
    },
}

And celery.py file:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.prod')

app = Celery('project')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

When I run: celery -A project worker -l info -B -E

it connects to the RabbitMQ server instead of the Redis server, as shown below:

 -------------- celery@user-desktop v4.2.1 (windowlicker)
---- **** ----- 
--- * ***  * -- Linux-4.15.0-39-generic-x86_64-with-Ubuntu-18.04-bionic 2018-11-21 12:04:51
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         project:0x7f8b80f78d30
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: ON
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . app.tasks.task_number_one
  . project.celery.debug_task

[2018-11-21 12:04:51,741: INFO/Beat] beat: Starting...

The same happens in the production environment. In production I have deployed the Django application with Gunicorn and Nginx, and now I want to implement some method to run background tasks, as the django-crontab package is not working.

Problem:

  1. What is the problem with the Celery configuration?

  2. Could anyone recommend a method to run periodic background tasks?

Note: I have tried implementing Supervisor, but it seems Supervisor is not compatible with Python 3, and therefore I could not configure it.

Reema Parakh

5 Answers

9

The setting for the broker URL changed in v4. It should be BROKER_URL, not CELERY_BROKER_URL.
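A minimal settings.py sketch of that fix, assuming (as in the question) that config_from_object('django.conf:settings') is called without a namespace argument, so Celery looks up the old-style bare setting names:

```python
# settings.py -- sketch using the old-style (unprefixed) setting names.
# With no namespace passed to config_from_object, Celery reads BROKER_URL;
# the result-backend and serializer names below are old-style names too,
# which is why the question's output showed the Redis result backend but
# an AMQP broker.
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
```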

2ps
8

If you have copied the content of celery.py from the official Celery first-steps-with-Django guide (https://docs.celeryproject.org/en/latest/django/first-steps-with-django.html),

try changing the following line from

app.config_from_object('django.conf:settings', namespace='CELERY')

to

app.config_from_object('django.conf:settings', namespace='')
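The difference between the two calls is only which setting names Celery looks up: with namespace='CELERY' it reads keys prefixed CELERY_, while with an empty namespace it reads the bare old-style names. A loose, hypothetical illustration of that prefixing (not Celery's actual internals):

```python
def namespaced_key(option, namespace=''):
    """Loosely mimic how a config namespace prefixes setting names."""
    return '{}_{}'.format(namespace, option) if namespace else option

# With namespace='CELERY', the broker setting is looked up as CELERY_BROKER_URL;
# with namespace='' it is looked up as plain BROKER_URL.
print(namespaced_key('BROKER_URL', namespace='CELERY'))  # CELERY_BROKER_URL
print(namespaced_key('BROKER_URL'))                      # BROKER_URL
```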

Jinesh
4

Replace CELERY_BROKER_URL = 'redis://localhost:6379' with BROKER_URL = 'redis://localhost:6379'. This worked for me.

Linc
1

After changing BROKER_URL to CELERY_BROKER_URL, you also have to change this line in celery.py:

app = Celery('proj')

Add backend='redis://localhost' and broker='redis://' so it looks like this:

app = Celery('proj', backend='redis://localhost', broker='redis://')

Now it should work. :)

Ijaz Sahab
0

If you use Redis as the broker and queue tasks with the .delay() method, but get a strange "connection error 111" refusing a connection to RabbitMQ (which you don't use at all), try .apply_async() instead.

This behavior happens in production.

marin