
In the past I have used Celery with async Python and Django applications, where both the sender and receiver understand the task being sent to RabbitMQ for processing via Celery (they could be the same app in a cluster, etc.).

I have a use case right now where a .NET service is publishing messages to RabbitMQ in the form of JSON, one message type per queue. The .NET app will only publish a message, make sure it was properly received by Rabbit, then walk away. I will then have a Django application running that consumes the messages, and I'm unsure of the proper way to consume these JSON messages so this Django app can simply store the data via its models and confirm the message as processed.
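For concreteness, a message might look something like this, with the Django side mapping it onto a model (the field names and the `Order` model are made up for illustration):

```python
import json

# Hypothetical JSON body published by the .NET service
raw_body = '{"order_id": 123, "status": "shipped"}'
payload = json.loads(raw_body)

# On the Django side I would store it via the ORM, roughly:
# Order.objects.update_or_create(pk=payload["order_id"],
#                                defaults={"status": payload["status"]})
```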

Using Celery/Kombu, I'm unsure of the best way to access the queues so that we have a dedicated consumer per queue. I understand that Celery uses Kombu under the covers, so I imagine I can create the consumer there, but then I foresee it being impossible to manage the process à la Celery and Flower, and spawning a rogue consumer thread on app start seems flaky at best.

bryan
  • See answers here: https://stackoverflow.com/questions/11964742/can-a-celery-worker-server-accept-tasks-from-a-non-celery-producer – michauwilliam Sep 12 '18 at 06:56

0 Answers