I have a Flask REST API that uses Celery to run requests asynchronously.
The idea is that an async=1 query parameter indicates the request should be processed asynchronously, immediately returning a task ID that the client uses later to fetch the result.
At the same time, I want to stop accepting new tasks when too many are already waiting to be processed.
The code below works, but accepting_new_tasks() takes ~2 seconds, which is way too slow.
Is there a Celery setting (or similar mechanism) that limits the number of waiting tasks, or a faster way to count them?
import math

from celery import Celery
from flask import abort, Flask, jsonify, request

flask_app = Flask(__name__)
celery_app = Celery("tasks", broker="rabbit...")


@flask_app.route("/")
def home():
    async_ = request.args.get("async")
    settings = request.args.get("settings")

    if async_:
        if not accepting_new_tasks(celery_app):
            return abort(503)
        task = celery_app.send_task(name="my-task", kwargs={"settings": settings})
        return jsonify({"taskId": task.id})

    return jsonify({})


def accepting_new_tasks(celery_app):
    inspector = celery_app.control.inspect()
    nodes_stats = inspector.stats()
    nodes_reserved = inspector.reserved()

    workers = 0
    for stats in nodes_stats.values():
        workers += stats["pool"]["max-concurrency"]

    waiting_tasks = 0
    for reserved in nodes_reserved.values():
        waiting_tasks += len(reserved)

    return waiting_tasks < math.ceil(workers / 3)
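In case it helps frame the question: one alternative I've been sketching is asking RabbitMQ for the queue depth directly instead of broadcasting inspect() to every worker, since that is a single AMQP round trip. This is only a sketch under assumptions: the queue name "celery" (Celery's default) and the use of pika are my guesses, and the broker's count covers only messages still in the queue, not tasks a worker has already reserved (prefetched).

```python
import math


def max_waiting_tasks(worker_slots: int) -> int:
    # Same threshold as accepting_new_tasks(): allow up to
    # ceil(worker_slots / 3) tasks to wait for processing.
    return math.ceil(worker_slots / 3)


def queued_message_count(amqp_url: str, queue: str = "celery") -> int:
    """Return the number of messages currently sitting in a RabbitMQ queue.

    One AMQP round trip, so it avoids the inspect() broadcast entirely.
    "celery" is Celery's default queue name; adjust if task_default_queue
    or task_routes is configured differently.
    """
    # Imported lazily so the threshold helper works even where pika
    # is not installed.
    import pika

    connection = pika.BlockingConnection(pika.URLParameters(amqp_url))
    try:
        channel = connection.channel()
        # passive=True only checks the queue (never creates it) and
        # raises if the queue does not exist.
        declared = channel.queue_declare(queue=queue, passive=True)
        return declared.method.message_count
    finally:
        connection.close()
```

With something like this, accepting_new_tasks() would reduce to comparing queued_message_count(...) against max_waiting_tasks(total_worker_slots), with the total slot count cached or configured rather than fetched from workers on every request. I don't know whether this is the idiomatic approach, hence the question.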