
I'm trying to use Redis as a Celery broker for my Django project, which runs under Docker Compose. I can't figure out what I've done wrong: the console logs say Redis is running and accepting connections (and indeed, `docker ps` shows the container running), yet I still get a connection-refused error. I even ran

docker exec -it <redis_container_name> redis-cli
ping

and saw that the response was PONG.
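
For completeness, a similar check can be run from inside the web container itself (just a sketch; it assumes the redis-py package is installed there, and <web_container_name> is a placeholder):

docker exec -it <web_container_name> python -c "import redis; print(redis.StrictRedis(host='localhost', port=6379).ping())"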

Here are the Celery settings in my settings.py:

BROKER_URL = 'redis://localhost:6379/0'
BROKER_TRANSPORT = 'redis'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ENABLE_UTC = True
CELERY_TIMEZONE = "UTC"

Here are the Redis container settings in my docker-compose.yml:

redis:
    image: redis
    ports:
        - "6379:6379"

I remembered to link the redis container with my web container as well. I can start up the server just fine, but I get the connection refused error when I try to upload anything to the site. What exactly is going wrong?
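
For reference, the link in my docker-compose.yml looks roughly like this (the options other than links are placeholders):

web:
    build: .
    ports:
        - "8000:8000"
    links:
        - redis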

EDIT: I did use VBoxManage to set up port forwarding, so I can access my site in the browser at localhost:8000; given that, it doesn't seem like I need to use the VM's IP instead of localhost in my settings.py.

EDIT 2: If I replace localhost in the settings with either the IP address of the docker-machine VM or the IP address of the Redis container, then my site very quickly shows a false success message when I upload a file, but nothing actually gets uploaded. The underlying upload function, insertIntoDatabase(), uses .delay().

Dan K
  • Are you using boot2docker? If yes, you should use the boot2docker ip command to get the Docker engine VM's IP address and use it instead of localhost. – Alex da Silva Sep 16 '15 at 19:44
  • ^ I had the same issue, as described above: I had to use the IP address entry in /etc/hosts created by docker-compose. – erewok Sep 16 '15 at 23:18
  • Did you try replacing `localhost` with `redis` in settings.py? You already have the `--link` option when starting the web container. – BMW Sep 18 '15 at 03:01
  • I used VBoxManage to port forward so that's not the issue. Also, I'm using docker-machine which replaced boot2docker. @BMW I also tried that, and I get time-out errors whenever I do that. – Dan K Sep 18 '15 at 19:52

2 Answers


I just had a similar problem after updating Celery from v3.1 to v4; according to this tutorial, BROKER_URL needs to be changed to CELERY_BROKER_URL in settings.py.

settings.py part

CELERY_BROKER_URL = 'redis://cache:6379/0'
CELERY_RESULT_BACKEND = 'redis://cache:6379/0'

docker-compose.yml part

version: '2'
services:
  web:
    container_name: django-container
    # ... other options ...

    depends_on:
      - cache
      - db

  cache:
    container_name: redis-container
    restart: always
    image: redis:latest
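
One more thing worth checking (my assumption about a typical Celery 4 + Django setup, not something from the original question): the CELERY_-prefixed settings only take effect if your Celery app loads the Django settings with the CELERY namespace, e.g. in a proj/celery.py along these lines (proj is a placeholder for your project package):

import os

from celery import Celery

# Placeholder settings module; adjust to your own project.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# namespace='CELERY' makes Celery read the CELERY_-prefixed names,
# such as CELERY_BROKER_URL and CELERY_RESULT_BACKEND, from settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()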
TitanFighter

Is Django running in a separate container that is linked to the Redis container? If so, you should have some environment variables with the IP and port that Django should use to connect to the Redis container. Set BROKER_URL to use the Redis IP and port env vars and you should be in business. Ditto for CELERY_RESULT_BACKEND.

Reference docs for the env vars are here: Docker Compose docs

Here's some example code for how we use the automatically added env vars in one of our projects at OfferUp:

import os  # needed for os.environ.get below

BROKER_TRANSPORT = "redis"
# REDIS_PORT_6379_TCP_ADDR and REDIS_PORT_6379_TCP_PORT are injected
# automatically by Docker Compose into containers linked to the redis service.
_REDIS_LOCATION = 'redis://{}:{}'.format(os.environ.get("REDIS_PORT_6379_TCP_ADDR"), os.environ.get("REDIS_PORT_6379_TCP_PORT"))
BROKER_URL = _REDIS_LOCATION + "/0"
CELERY_RESULT_BACKEND = _REDIS_LOCATION + "/1"
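
If your Compose version no longer injects those link variables (newer versions rely on DNS-based service discovery instead), a sketch with a fallback to the service hostname might look like this; the "redis" and "6379" defaults below are assumptions based on the question's service definition:

import os

# Fall back to the Compose service name as hostname when the
# link-injected variables are absent.
_redis_host = os.environ.get("REDIS_PORT_6379_TCP_ADDR", "redis")
_redis_port = os.environ.get("REDIS_PORT_6379_TCP_PORT", "6379")
_REDIS_LOCATION = 'redis://{}:{}'.format(_redis_host, _redis_port)

BROKER_URL = _REDIS_LOCATION + "/0"
CELERY_RESULT_BACKEND = _REDIS_LOCATION + "/1"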
Sean Azlin
  • If I understood you correctly, what I did was use `docker inspect` to find the IP of the Redis container and then change `localhost` to that IP in my `settings.py`. What that did was create a strange bug for me where, if I upload a file to my database, it instantly says the upload was successful but nothing actually gets uploaded. Did I at least understand your comment correctly? – Dan K Sep 18 '15 at 20:23
  • Not quite – when you link Docker containers and start them using Docker Compose, Compose automatically inserts environment variables into each running container with the IP, port, and protocol to use to talk to the other containers. You can then use os.environ.get() in your settings.py to load the relevant settings. Here are the relevant reference docs: https://docs.docker.com/compose/env/ – Sean Azlin Sep 19 '15 at 01:34
  • I've added some example code to my answer for how you can use the docker compose env vars to, for example, configure celery to use a linked redis container. – Sean Azlin Sep 19 '15 at 01:46
  • Thanks, following your example and fitting it to my project seems to be the solution as far as the "connection refused" error goes. Still, I'm now getting the same bug where a file upload finishes too quickly and I'm told it was successful when in fact nothing actually happened. Is this still related to the settings, or is it a different issue with Celery? The upload function, `insertIntoDatabase()`, uses `.delay()`. – Dan K Sep 19 '15 at 22:35
  • Sounds like a different thing related to how you're handling the upload and/or working with Celery. It'd probably be ideal if you posted a new question that includes the view code involved. – Sean Azlin Sep 20 '15 at 04:48
  • That's what I thought, thanks for confirming that. This means that you did answer the original question, so I'm going ahead and choosing your answer now. – Dan K Sep 20 '15 at 18:46