
I have a Django application serving multiple users. Each user can submit resource-intensive tasks (minutes to hours) for execution. I want the tasks executed according to a fair distribution of resources. The backend uses Celery and RabbitMQ for task execution.

I have looked extensively and haven't found a solution for my particular case (or haven't been able to piece one together). As far as I can tell, there is no built-in feature in Celery or RabbitMQ that can do this. Is it possible to write custom code to control the order in which tasks are executed? That would let me calculate priorities based on user data and choose which task should run next.

Related: How can Celery distribute users' tasks in a fair way?

Roberto

1 Answer


AMQP queues are FIFO, so it is not possible to pull a task from the middle of the queue for execution. Two solutions come to mind:

a.) As mentioned in the other post, use a lock to limit resources by user.
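A minimal sketch of the lock idea, capping how many tasks a single user may run concurrently. All names here are hypothetical, and the counters are in-memory only for illustration; in a real Celery deployment they would have to live in a shared store (e.g. Redis or the Django cache), since workers run as separate processes:

```python
import threading
from collections import defaultdict

class UserLimiter:
    """Allow at most `max_per_user` concurrently running tasks per user.
    Hypothetical illustration: a real setup needs a cross-process store."""

    def __init__(self, max_per_user):
        self.max_per_user = max_per_user
        self.running = defaultdict(int)  # user_id -> running task count
        self.lock = threading.Lock()

    def try_acquire(self, user_id):
        """Return True and claim a slot if the user is under the cap."""
        with self.lock:
            if self.running[user_id] >= self.max_per_user:
                return False
            self.running[user_id] += 1
            return True

    def release(self, user_id):
        """Free a slot when the user's task finishes."""
        with self.lock:
            self.running[user_id] -= 1
```

A task would then call `try_acquire()` on entry and, if it fails, requeue itself for later (in Celery, typically by raising `self.retry(countdown=...)` from a bound task), releasing the slot in a `finally` block once the work is done.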

b.) Have two queues: a submission queue and an execution queue. The submission queue keeps the execution queue full of work based on whatever algorithm you choose to implement. This will likely be more complex, but may be closer to what you are looking for.
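The "whatever algorithm" part of option (b) could be as simple as round-robin over per-user submission queues, so no single user can monopolize the workers. A broker-free sketch of just the selection logic (class and method names are hypothetical; in practice `next_task()` would be driven by a dispatcher that publishes to the Celery execution queue whenever a worker slot frees up):

```python
from collections import defaultdict, deque

class FairDispatcher:
    """Round-robin over per-user pending queues: each next_task() call
    returns work from the next user in the rotation who has tasks."""

    def __init__(self):
        self.queues = defaultdict(deque)  # user_id -> pending tasks
        self.order = deque()              # users currently awaiting a turn

    def submit(self, user_id, task):
        """Add a task for a user, entering them into the rotation if absent."""
        self.queues[user_id].append(task)
        if user_id not in self.order:
            self.order.append(user_id)

    def next_task(self):
        """Return (user_id, task) for the next user in the rotation,
        or None when nothing is pending."""
        while self.order:
            user = self.order.popleft()
            if self.queues[user]:
                task = self.queues[user].popleft()
                if self.queues[user]:       # still has work: back of the line
                    self.order.append(user)
                return user, task
        return None
```

With this policy, a user who queued fifty tasks and a user who queued one would alternate until the second user's work is done, which is one reasonable notion of "fair" here.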

Robert Kearns
  • The tasks are executed as soon as they are submitted, though. To implement these solutions, tasks would sometimes need to be rejected and requeued (solution a) or consumed only when the execution queue has free resources (solution b). Is this possible in Celery/RabbitMQ, or would I need to bypass them (e.g. by using files for jobs) and have a periodically executed task queue the tasks? – Roberto Jun 05 '20 at 11:46