
I have a simple HTTP server set up like this one. It handles a slow request, about 40 seconds, to open and then close gates (real metallic gates). If a second HTTP request is made while the first one is still executing, it is placed in a queue and executed after the first run completes. I don't want this behavior: I need to reply with an error if the gate open/close procedure is currently in progress. How can I do that? There's a parameter 'request_queue_size', but I'm not sure how to set it.
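Roughly the kind of setup in question (an illustrative sketch, not my exact code; the 40-second gate procedure is stubbed with sleep):

from http.server import BaseHTTPRequestHandler, HTTPServer
from time import sleep

class GateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The whole open/close procedure runs inside the handler, so the
        # single-threaded HTTPServer cannot serve the next request until
        # this one returns (~40 seconds later) - it just waits in the queue.
        sleep(40)  # stand-in for driving the real gates
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Cycle finished")

HTTPServer(('', 8080), GateHandler).serve_forever()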

Eugene V
  • Would help to see some code... – heinst Aug 31 '16 at 17:44
  • Perhaps you can have a "cooldown" period: whenever you receive a request within a certain window (say, a minute to be safe?) after another request, you reply with an HTTP 503 Service Unavailable error? – sethmlarson Aug 31 '16 at 17:45
  • I don't have the exact code right now, but it's 99% like in this example: http://www.acmesystems.it/python_httpd – Eugene V Aug 31 '16 at 17:52

3 Answers


You need to follow a different strategy in designing your service. Keep the state of the door either in memory or in a database. Then, each time you receive a request to act on the door, check the door's current state in that persistence layer and execute the action only if it is valid for the current state; otherwise, return an error. Also, don't forget to update the state of the door once an action completes.
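For instance, a rough sketch of that idea (the state names, transition table, and stubbed drive_gates call are illustrative, not tied to any particular hardware):

from time import sleep

# The door state lives in one place (a module-level variable here; a database
# row would work the same way). Every request is checked against this state
# before any hardware is driven, and the state is updated once the action completes.
door_state = "closed"  # could also be "open" or "moving"

VALID_TRANSITIONS = {
    "open": ("closed",),   # you may only open a closed door
    "close": ("open",),    # and only close an open door
}

def drive_gates(action):
    sleep(40)  # stand-in for the real relay sequence

def handle_action(action):
    global door_state
    if door_state not in VALID_TRANSITIONS.get(action, ()):
        # The door is mid-cycle or already in the requested position: report an error.
        return 503, "Door is %s, cannot %s now" % (door_state, action)
    door_state = "moving"  # mark the cycle as in progress
    try:
        drive_gates(action)
    finally:
        door_state = "open" if action == "open" else "closed"
    return 200, "OK"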

Jadiel de Armas
  • Thanks, I was thinking about this variant (like a semaphore file), but it makes things a lot more complicated. – Eugene V Aug 31 '16 at 17:46
  • It might be more complicated, but it is a correct way of doing it. If it is correct, it is worth the effort, unless you have another correct solution that is simpler. – Jadiel de Armas Aug 31 '16 at 17:49
  • In fact I cannot process a second request while the first one is running; the second request just stays in the queue without being processed. – Eugene V Sep 01 '16 at 06:21

'request_queue_size' seems to have no effect. The solution was to make the server multithreaded and implement a locking variable 'busy':

from socketserver import ThreadingMixIn
from http.server import BaseHTTPRequestHandler, HTTPServer
import time
from time import sleep
import logging
from gpiozero import DigitalOutputDevice

logging.basicConfig(format='%(asctime)s %(levelname)s:%(message)s', level=logging.INFO)

hostName = ''
hostPort = 9001
busy = False  # set while a gate cycle is running in another thread

class ThreadingServer(ThreadingMixIn, HTTPServer):
    pass

class MyServer(BaseHTTPRequestHandler):
    def do_GET(self):
        global busy
        if self.path == '/gates':
            if busy:
                # A cycle is already in progress: refuse instead of queueing.
                # The 503 has to go out before any other headers, so check first.
                self.send_error(503)
                return
            busy = True
            try:
                self.send_response(200)
                self.send_header("Content-type", "text/html")
                self.end_headers()
                self.wfile.write(bytes("Hello!<br>", "utf-8"))
                relay = DigitalOutputDevice(17)  # Initialize GPIO 17
                relay.on()
                logging.info('Cycle started')
                self.wfile.write(bytes("Cycle started<br>", "utf-8"))
                sleep(2)
                relay.close()                    # end the first pulse, release the pin
                sleep(20)                        # wait while the gates move
                relay = DigitalOutputDevice(17)  # second pulse to close the gates
                relay.on()
                sleep(2)
                relay.close()
                logging.info('Cycle finished')
                self.wfile.write(bytes("Cycle finished", "utf-8"))
            finally:
                busy = False                     # always clear the flag, even on error
        else:
            self.send_response(200)
            self.send_header("Content-type", "text/html")
            self.end_headers()
            self.wfile.write(bytes("Hello!<br>", "utf-8"))

myServer = ThreadingServer((hostName, hostPort), MyServer)
print(time.asctime(), "Server Starts - %s:%s" % (hostName, hostPort))

try:
    myServer.serve_forever()
except KeyboardInterrupt:
    pass

myServer.server_close()
print(time.asctime(), "Server Stops - %s:%s" % (hostName, hostPort))
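A plain boolean flag like this can in principle race if two requests hit the check at exactly the same moment. A threading.Lock with a non-blocking acquire does the same job atomically; here is a rough sketch of that variant (run_gate_cycle is a hypothetical helper standing in for the relay pulses above):

import threading
from http.server import BaseHTTPRequestHandler

gate_lock = threading.Lock()

class LockingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/gates':
            # acquire(blocking=False) returns immediately instead of waiting,
            # so a concurrent request gets a 503 rather than being queued.
            if not gate_lock.acquire(blocking=False):
                self.send_error(503)
                return
            try:
                self.send_response(200)
                self.end_headers()
                run_gate_cycle()  # hypothetical helper wrapping the relay pulses above
                self.wfile.write(b"Cycle finished")
            finally:
                gate_lock.release()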
Eugene V

In general, the idea you're looking for is called request throttling. There are plenty of implementations of this kind of thing that shouldn't be hard to dig up on the Web; here's one for Flask, my microframework of choice: https://flask-limiter.readthedocs.io/en/stable/

Quick usage example:

@app.route("/open_gate")
@limiter.limit("1 per minute")   # further requests within the window are rejected
def slow():
    gate_robot.open_gate()       # stand-in for your actual gate-control call
    return "Gate cycle started"  # a Flask view must return a response body
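This assumes app, limiter, and gate_robot are already defined elsewhere; the wiring for the first two might look roughly like this (constructor shown as in recent Flask-Limiter releases, gate_robot stays a placeholder):

from flask import Flask
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address

app = Flask(__name__)
# Rate-limit per client IP; requests over the limit are answered with HTTP 429.
limiter = Limiter(get_remote_address, app=app)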
Derek Janni