
I recently started having a need for a task queue in my Django program, and I'm worried about how robust it will be in the future or what will need to be overcome for a production deployment.

I'm using the Redis-Queue or RQ library for Python, which markets itself as easier to learn and use than something like Celery (which I haven't quite learned). Does anyone have any input on this? Do you think RQ coupled with Redis would be OK in production, or would you use something else? What do large-scale apps use for task queueing?

Ian Moore

2 Answers


Celery works in production. As far as I know, it is the most mature Python/Django implementation of an asynchronous task queue/job queue based on distributed message passing, and it works just fine with Redis as a broker.

In production you need more than just sending a simple message; you need a high-level API for:

  • Scaling and auto-scaling.
  • Real-time monitoring.
  • Task scheduling.
  • Prioritization.
  • Multi-broker support.
  • Workflow-based tasks.

Celery provides all of these.

panchicore

Yes, Redis and RQ are very easy to set up, and yes, they can be used in a production system. You can also take a look at RabbitMQ, which can be used as a task-queue broker; it supports persistence and clustering for scaling.

Adam
  • Could you tell me if RabbitMQ can be used independently or does it need something like Celery to go with it? I've looked into RabbitMQ but it appears that I must use Celery or build some sort of custom handler to go along with it. (EDIT: I did find https://stackoverflow.com/questions/9077687/why-use-celery-instead-of-rabbitmq, but would still like some additional perspectives.) – Ian Moore Jul 15 '18 at 01:25