17

I'm trying to write some unit tests for some Celery tasks in my Django app. These tasks take a model id as their argument, do some stuff, and update the model. When running a dev server and a Celery worker, everything works great, but when running my tests it has become clear that the Celery task is not using the Django test db that gets created and destroyed as part of the test run. The question is: how can I get Celery to use the same temporary db as the rest of my tests?

As you can see, I'm using the settings overrides that are suggested in every answer for similar issues.

UPDATE: Discovered that instead of passing the object id to the task and having the task get it from the db, if I simply pass the object itself to task, the tests work correctly with apparently no adverse effects on the functioning of the task. So at least for now, that will be my fix.
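For illustration, a minimal sketch of that workaround, assuming the task shown below is refactored to accept the instance itself (this is my paraphrase, not the exact fix):

@celery.task
def do_a_thing(job):
    # No Job.objects.get() lookup, so the task never has to read the row
    # from its own (possibly different) database connection.
    bunch_of_things(job)
    job.complete = True
    job.save()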

In my test:

from django.test import TestCase
from django.test.utils import override_settings

from myapp import models, tasks  # app module name assumed, not in the original


class JobTest(TestCase):

    @override_settings(CELERY_ALWAYS_EAGER=True,
                       CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                       BROKER_BACKEND='memory')
    def test_Job_Complete(self):
        job = models.Job()
        job.save()
        tasks.do_a_thing(job.id)
        self.assertTrue(job.complete)

In my task:

# `celery` here is assumed to be the project's Celery app instance
@celery.task
def do_a_thing(job_id):
    job = models.Job.objects.get(pk=job_id)
    bunch_of_things(job)
    job.complete = True
    job.save()
ecline6
  • However, passing the object itself to a task can have side effects, such as reverting data that was saved in the meantime. Is this still your solution, or do you have another one? – Jonathan Sep 12 '14 at 11:56
  • Unfortunately, passing the object itself doesn't work for me since it contains ManyToManyFields. The contents of the m2m fields don't get transferred to the tasks correctly. – Ginkobonsai Oct 15 '18 at 15:51

4 Answers

5

One way to guarantee that the Celery worker is configured to use the same test database as the tests is to spawn the Celery worker inside the test itself, by calling start_worker in the setUpClass method of the TestCase:

from celery.contrib.testing.worker import start_worker

from myproject.celery import app

@classmethod
def setUpClass(cls):
    # start_worker() returns a context manager, so it must be entered
    cls.celery_worker = start_worker(app)
    cls.celery_worker.__enter__()

You also have to use SimpleTestCase from Django (or APISimpleTestCase from REST framework) rather than a plain TestCase, so that the Celery thread and the test thread can see the changes each makes to the test database. The changes are still destroyed at the end of testing, but they are not destroyed between tests unless you destroy them manually in the tearDown method.
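For completeness, here is a sketch of how the whole test class might look. The myproject/myapp module names and the Job model are placeholders taken from the question; note that SimpleTestCase blocks database queries unless you opt back in with databases = '__all__' (Django 2.2+), which also addresses the comment below:

from django.test import SimpleTestCase

from celery.contrib.testing.worker import start_worker

from myproject.celery import app
from myapp import models, tasks


class JobTaskTest(SimpleTestCase):
    databases = '__all__'  # allow db queries in a SimpleTestCase (Django 2.2+)

    @classmethod
    def setUpClass(cls):
        super().setUpClass()
        # Spawn a worker thread inside the test process so it shares the
        # test database; skipping the ping check avoids needing the
        # celery.contrib.testing ping task to be registered.
        cls.celery_worker = start_worker(app, perform_ping_check=False)
        cls.celery_worker.__enter__()

    @classmethod
    def tearDownClass(cls):
        cls.celery_worker.__exit__(None, None, None)
        super().tearDownClass()

    def test_job_complete(self):
        job = models.Job.objects.create()
        # .get() assumes a result backend is configured for the app
        tasks.do_a_thing.delay(job.id).get(timeout=10)
        job.refresh_from_db()
        self.assertTrue(job.complete)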

Ali Husham
drhagen
  • SimpleTestCase is not suitable for tests with database queries. Do you have any solutions for these cases? – muradin Aug 21 '21 at 10:30
4

I battled with a similar problem. The following solution is not clean but it works.

  1. Create a separate Django settings file that inherits from your main one. Let's call it integration_testing.py.
  2. Your file should look like this:
    from .settings import *

    DATABASES = {
        'default': {
            'ENGINE': '<your engine>',
            'NAME': 'test_<your database name>',
            'USER': '<your db user>',
            'PASSWORD': '<your db password>',
            'HOST': '<your hostname>',
            'PORT': '<your port number>',
        }
    }

  3. Create a shell script which will set your environment and start up the celery worker:

    #!/usr/bin/env bash

    export DJANGO_SETTINGS_MODULE="YOURPROJECTNAME.settings.integration_testing"

    celery purge -A YOURPROJECTNAME -f && celery worker -A YOURPROJECTNAME -l debug

  4. The above works if you configured celery in this manner:

    app = Celery('YOURPROJECTNAME')

    app.config_from_object('django.conf:settings', namespace='CELERY')

  5. Run the script in the background.

  6. Make all tests that involve Celery inherit from TransactionTestCase (or APITransactionTestCase in django-rest-framework); a sketch of such a test follows after this list.

  7. Run your unit tests that use Celery. Any Celery task will now use your test db. And hope for the best.
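As promised, a sketch of the test from step 6, assuming the question's Job model and do_a_thing task and a worker already running from step 5 (the module names are placeholders):

from django.test import TransactionTestCase

from myapp import models, tasks  # placeholder module names


class JobIntegrationTest(TransactionTestCase):

    def test_do_a_thing(self):
        job = models.Job.objects.create()
        # TransactionTestCase commits instead of rolling back, so the
        # separately running worker can see the row.
        result = tasks.do_a_thing.delay(job.id)
        result.get(timeout=10)  # assumes a result backend is configured
        job.refresh_from_db()
        self.assertTrue(job.complete)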

Marviel
Fred Campos
    This solution worked great for me, but I just wanted to add that **_with some databases, in order for Celery to see the database models, you must use some variant of TransactionTestCase or APITransactionTestCase._** This is because django uses database transactions to speed up testing. https://stackoverflow.com/questions/35305997/why-isnt-django-actually-writing-changes-to-test-db – Marviel Nov 07 '17 at 12:43
2

There's no obvious problem with your code. You don't need to run a celery worker. With these settings celery will run the task synchronously and won't actually send anything to your message queue.
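To make that concrete, a hypothetical fragment from inside a test method with those overrides in place:

# CELERY_ALWAYS_EAGER=True makes .delay() run the task synchronously in the
# test process, so it uses the same test database connection as the test.
result = tasks.do_a_thing.delay(job.id)
self.assertTrue(result.successful())  # EagerResult, no broker involved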

You can't easily run tests against live Celery workers anyway, because each test is wrapped in a transaction: even if the worker were connecting to the same database (it isn't), the test's transaction is always rolled back rather than committed, so its changes are never visible to the worker.

If you really need to do this, look at this stackoverflow answer.

joshua
  • I see your point, but Fred Campos' answer actually shows how to do an integration test with Celery, which really is a must for my purposes. – Marviel Nov 07 '17 at 12:48
0

I have found adding the following to conftest.py works:

from django.conf import settings

...

@pytest.fixture(scope="session")
def celery_worker_parameters(django_db_setup):
    assert settings.DATABASES["default"]["NAME"].startswith("test_")
    return {}

The trick is to request the django_db_setup fixture here so that it is also enabled on the worker.

This was tested for tests marked with:

@pytest.mark.django_db(transaction=True)
@pytest.mark.celery()
def test_something(celery_worker):
    ...
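Regarding the DisabledBackend comment below: one possible fix (my assumption, not part of this answer) is to override the celery_config fixture from Celery's pytest plugin so the embedded worker gets an in-memory broker and a working result backend:

import pytest


@pytest.fixture(scope="session")
def celery_config():
    # In-memory transports so tests can wait on task results.
    return {
        "broker_url": "memory://",
        "result_backend": "cache+memory://",
    }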
Udi
  • Sorry, what did you configure for celery to not use DisabledBackend? In my tasks I cannot call wait() on my tasks, the error says "No result backend is configured". – Oleksandr Zelentsov Feb 17 '21 at 10:17