I am working on a small but computationally intensive Python app. The computationally intensive work can be broken into several pieces that can be executed concurrently, and I am trying to identify a suitable stack to accomplish this.
Currently I am planning to use a Flask app on Apache2 + WSGI, with Celery for the task queue.
In the following, will a_long_process(), another_long_process(), and yet_another_long_process() execute concurrently if there are three or more workers available? Will the Flask app be blocked while the tasks are executing?
from the Flask app:
from flask import Flask
from tasks import a_long_process, another_long_process, yet_another_long_process

myapp = Flask(__name__)

@myapp.route('/foo')
def bar():
    # x and y are defined elsewhere; each task is dispatched and then
    # immediately waited on before the next one is dispatched
    task_1 = a_long_process.delay(x, y)
    task_1_result = task_1.get(timeout=1)
    task_2 = another_long_process.delay(x, y)
    task_2_result = task_2.get(timeout=1)
    task_3 = yet_another_long_process.delay(x, y)
    task_3_result = task_3.get(timeout=1)
    return task_1_result + task_2_result + task_3_result
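If those sequential get() calls do block the dispatch of the later tasks, I assume the route would need to be restructured roughly as follows, using a Celery group so that all three tasks are sent to the broker before waiting on any result (untested sketch; x and y again stand in for the real arguments):

from celery import group

@myapp.route('/foo')
def bar():
    # Send all three task signatures to the broker at once
    job = group(
        a_long_process.s(x, y),
        another_long_process.s(x, y),
        yet_another_long_process.s(x, y),
    )
    group_result = job.apply_async()
    # Wait for all three results together; they come back in submission order
    task_1_result, task_2_result, task_3_result = group_result.get(timeout=10)
    return task_1_result + task_2_result + task_3_result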
tasks.py:
from celery import Celery

celery = Celery('tasks', broker="amqp://guest@localhost//", backend="amqp://")

@celery.task
def a_long_process(x, y):
    return something

@celery.task
def another_long_process(x, y):
    return something_else

@celery.task
def yet_another_long_process(x, y):
    return a_third_thing
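For reference, I assume the worker would be started with something like celery -A tasks worker --loglevel=info; its --concurrency option (which defaults to the number of CPU cores) determines how many tasks a single worker instance can run in parallel, which is what I have in mind when I say "3 or more workers available".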