
I have a Flask application which listens for jobs to do. Each job takes quite a long time (say, one minute), and I would like to avoid processing two requests at the same time.

It would be great if, once I receive a request, I could close the port Flask is listening on and reopen it when finished. Alternatively, I could set up a semaphore, but I am not sure how Flask handles concurrency.

Any advice?

from flask import Flask, request
app = Flask(__name__)

@app.route("/",methods=['GET'])
def say_hi():
    return "get not allowed"

@app.route("/",methods=['POST'])
def main_process():
    # heavy process here to run alone
    return "Done"

if __name__ == "__main__":
    app.run(debug=True,host='0.0.0.0')
mosh442
  • How are you planning on running Flask? Through Flask directly, or are you running it as a WSGI module? – Georg Schölly Feb 19 '17 at 08:34
  • I am using the WSGI module – mosh442 Feb 19 '17 at 08:36
  • In this case it might be a bit more complicated. A WSGI server (depending on the configuration) can spawn multiple processes in parallel, but Python's locks only work across threads, not across processes. You need to introduce a shared resource you can lock. That could be the database, a file or a shared lock, for example a [named semaphore](http://stackoverflow.com/q/2798727). – Georg Schölly Feb 19 '17 at 08:59
  • I see this is an old question, but I solved this problem by introducing a load balancer that redirects requests to "not busy" servers. – Kots Mar 04 '22 at 09:15
  • My reading of https://docs.python.org/3/library/multiprocessing.html#synchronization-primitives suggests that using a `Lock` object from the `multiprocessing` library is exactly the right thing to gain a lock across all threads and/or processes spawned by WSGI. – chrisinmtown Jun 13 '23 at 17:40
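The `multiprocessing.Lock` approach from the last comment can be sketched as follows. This is a minimal illustration, not a full endpoint: it assumes the lock is created at module import time (i.e. in the master process, before a preforking server such as uWSGI forks its workers), and `try_run_job` is a hypothetical helper name.

```python
import multiprocessing

# A Lock created at import time, before workers are forked, is inherited
# by every forked worker, so it can serialize the heavy job across
# processes as well as threads.
job_lock = multiprocessing.Lock()

def try_run_job(work):
    # Refuse immediately instead of queueing if another worker is busy.
    if not job_lock.acquire(block=False):
        return "busy"
    try:
        return work()
    finally:
        job_lock.release()
```

Whether the lock is actually shared depends on the server's forking model; with lazy-loading configurations each worker imports the module itself and gets its own lock.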

3 Answers


You could use a semaphore for this:

import threading

sem = threading.Semaphore()

@app.route("/", methods=['POST'])
def main_process():
    with sem:  # released even if the heavy process raises
        pass   # heavy process here to run alone
    return "Done"

A semaphore controls access to a shared resource.

You can find more information about semaphores here.

This SO question may help you as well: here
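Note that the code above makes a second request wait until the first finishes. If you would rather turn the second caller away immediately, a non-blocking acquire can be used instead. This is a sketch with the route decorator omitted so it stands alone; the "busy" message text is illustrative.

```python
import threading

sem = threading.Semaphore()

def main_process():
    # Non-blocking acquire: a second request is rejected at once
    # instead of waiting for the first one to finish.
    if not sem.acquire(blocking=False):
        return "Busy, try again later"
    try:
        return "Done"  # heavy process here to run alone
    finally:
        sem.release()
```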

EDIT:

As Georg Schölly wrote in the comments, the solution above is problematic when multiple worker processes are involved.

However, you can use uWSGI's locking API to accomplish your goal:

import uwsgi

@app.route("/", methods=['POST'])
def main_process():
    uwsgi.lock()
    try:
        # Critical section:
        # heavy process here to run alone
        pass
    finally:
        uwsgi.unlock()
    return "Done"

uWSGI supports a configurable number of locks you can use to synchronize worker processes.

For more info, read here

omri_saadon
  • This does not work if there are multiple processes. The lock here does not work across multiple processes, which is the normal configuration for web servers. – Georg Schölly Feb 19 '17 at 09:01
  • @GeorgSchölly, thanks for your comment; I've added an edit section regarding it. – omri_saadon Feb 19 '17 at 09:37
  • Just to understand fully: by multiple processes, do you mean running multiple Python scripts and having locking work across them all? And `Semaphore` and `Lock` do not do this? – Rahim Khoja Apr 14 '22 at 18:22

I tested this and it worked as expected. Use it with caution, though: it can block threads on your server and overload it. After rethinking, I decided I did not need it in my context.

from functools import wraps
from threading import Lock

def single_threaded_endpoint(f):
    lock = Lock()

    @wraps(f)
    def decorated_function(*args, **kwargs):
        with lock:
            return f(*args, **kwargs)

    return decorated_function

@single_threaded_endpoint
def stripe_notification():
    ...
Mithsew

You could try adding a threading.Lock to indicate that some work is already in progress:

import threading
from contextlib import ExitStack

busy = threading.Lock()
@app.route("/",methods=['POST'])
def main_process():
    if not busy.acquire(timeout=1):
        return 'The application is busy, refresh the page in a few minutes'

    # ensure busy.release() is called even if an exception is thrown
    with ExitStack() as stack:
        stack.callback(busy.release)
        # heavy process here to run alone

    return "Done"

But Flask's development server by default processes only one request at a time (more info here), so if you are fine with all other users' pages not loading until the process finishes (perhaps even hitting a request timeout), you don't have to change anything.
If you want other users to get a message instead, as in the code above, increase the number of workers to 2, so that while one worker processes the request, the other turns the rest away.
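The turn-away behaviour can be exercised without running a server by simulating two concurrent requests with plain threads. This is a sketch; the sleep stands in for the heavy job and the timings are illustrative.

```python
import threading
import time

busy = threading.Lock()
results = []

def handle():
    # Stand-in for the endpoint: reject if a job is already running.
    if not busy.acquire(timeout=0.1):
        results.append("busy")
        return
    try:
        time.sleep(0.5)  # stand-in for the heavy job
        results.append("Done")
    finally:
        busy.release()

# Two concurrent "requests": one does the work, the other is rejected.
threads = [threading.Thread(target=handle) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```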

illright
  • Without locks this is prone to race conditions. – Georg Schölly Feb 19 '17 at 08:33
  • This does not work if there are multiple processes. The lock here does not work across multiple processes, which is the normal configuration for web servers. – Georg Schölly Feb 19 '17 at 09:01
  • @GeorgSchölly there's a `threaded` option you can pass to tell the server that it's supposed to work in threaded mode – illright Feb 19 '17 at 09:04
  • That's true, but that assumes that the application runs through Flask's integrated server. Especially in production mode this is seldom the case. Mosh states in the comments to his question that he uses it through WSGI. – Georg Schölly Feb 19 '17 at 09:14