
I am currently writing a Django app that will have multiple deployments, each on its own sub-domain. That is, the same app will run from several different virtualenvs, and the deployments will share one MySQL installation (a separate database per deployment) and one Redis server.

The point where I am stuck is managing the Celery tasks so that they don't get mixed up between deployments. I have been reading about this all day and couldn't find a solution that is accepted as a standard.

Some of the solutions proposed were:

  1. Having separate queues for apps and have a worker listen to each queue.
  2. Having separate database numbers in the CELERY_BROKER_URL.

Regarding solution 2, I couldn't understand how it would work, or how consistency would be preserved.
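For context, this is how I imagine option 2 would look. The host, port, and the database index value are placeholders; the idea, as I understand it, is only that each deployment's settings point at a different Redis database number:

```python
# Hypothetical settings.py fragment for one deployment.
# The trailing number in the Redis URL is the database index; each
# deployment would use a different index so their task queues and
# results never touch each other's data.
REDIS_DB = 1  # e.g. 2 for the second deployment, and so on

CELERY_BROKER_URL = "redis://localhost:6379/%d" % REDIS_DB
CELERY_RESULT_BACKEND = "redis://localhost:6379/%d" % REDIS_DB
```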

Regarding solution 1, I am worried that a single mistake in the worker-to-queue mapping could be disastrous, because all the databases have the same schema and would happily accept any data sent to them, even data that was supposed to go to the database of another app instance.
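To make option 1 concrete, here is how I understand the setup would look. The queue name `app1` is just a placeholder per-deployment value, and the worker for each instance would be started against that queue only:

```python
# Hypothetical settings.py fragment for one deployment ("app1" here).
# Every task published by this instance goes to its own named queue,
# and the matching worker is started with:
#
#   celery -A proj worker -Q app1
#
# A worker accidentally started with -Q app2 would consume another
# instance's tasks -- which is exactly the mistake I am worried about.
CELERY_DEFAULT_QUEUE = "app1"
CELERY_DEFAULT_EXCHANGE = "app1"
CELERY_DEFAULT_ROUTING_KEY = "app1"
```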

Please provide some insight on this issue. Thank you!

  • I'm not sure I understand your difficulties with option 2. Give them separate database numbers, and the data will be separate, surely? – Daniel Roseman Nov 23 '15 at 18:32
  • This answers the question https://stackoverflow.com/questions/13924926/running-multiple-django-celery-websites-on-same-server – GunnerFan May 23 '20 at 10:25
