
I'm prototyping a service with Flask that will receive one POST request that may match multiple destinations, each of which will generate a response to send back to the sender. These responses must not be grouped together into one response. Flask, understandably, seems to enforce one response per request. Is there a way to [manually] generate and send my responses as needed?

Using the simple Flask hello-world demo as a base, the solution I'm looking for could be as simple as this, if it exists:

@app.route('/', methods=['POST'])
def receive():
    if request_matches_more_than_one_handler(request):
        # Let's say we want a total of 3 responses
        # Generate and send 2 responses here. How?
        ...
    # Now rely on Flask to send the third response
    return '200 OK...'

I cannot dispatch on a path.

Thanks for any input!

patrick
  • You can't wrap up your "responses" in a json object? This feels to me like it's not how HTTP is supposed to work in the first place. Something else that came to mind just now is that you could return in your first response if there's any data left, and whoever asked for the first response knows to send a request for a second or third response... kinda like a paginated solution? – Nicolás Marzano Oct 07 '19 at 18:35
  • HTTP does exactly what you don't want. Another way is to use a WebSocket in this scenario. –  Oct 07 '19 at 19:32

1 Answer


Formally this is not possible because of the HTTP specification, but here is one tricky workaround: you can use a streamed response.

Flask server

from flask import Flask, stream_with_context, request, jsonify, Response
from time import sleep
import json

app = Flask(__name__)


def destination1(param):
  sleep(5)
  return {"data": f"{param} from destination1"}


def destination2(param):
  sleep(5)
  return {"data": f"{param} from destination2"}


def destination3(param):
  sleep(5)
  return {"data": f"{param} from destination3"}


@app.route('/destination1', methods=['POST'])
def destination1_route():
  return jsonify(destination1(request.json["param"]))


@app.route('/destination2', methods=['POST'])
def destination2_route():
  return jsonify(destination2(request.json["param"]))


@app.route('/destination3', methods=['POST'])
def destination3_route():
  return jsonify(destination3(request.json["param"]))


# Dispatch table: look handlers up by name instead of eval()
# to avoid executing attacker-controlled input
DESTINATIONS = {
    "destination1": destination1,
    "destination2": destination2,
    "destination3": destination3,
}


@app.route('/multi_destination', methods=['POST'])
def streamed_response():

  destinations = request.json["destinations"]

  def generate(destinations):
    last_element_num = len(destinations) - 1
    yield '[\n'
    for i, destination in enumerate(destinations):
      handler = DESTINATIONS[destination["name"]]
      chunk = json.dumps(handler(destination["param"]))
      yield chunk + ('\n' if i == last_element_num else ',\n')
    yield ']'

  return app.response_class(stream_with_context(generate(destinations)), mimetype='application/json')


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)

Requester

from time import time
import requests

json_data = {
    "destinations": [
        {"name": "destination1", "param": "param1"},
        {"name": "destination2", "param": "param2"},
        {"name": "destination3", "param": "param3"}
    ]
}

start_time = time()
r = requests.post('http://127.0.0.1:8000/multi_destination', json=json_data, timeout=7)
end_time = time()

print('Output:')
print(r.text)

print('HTTP response time:', r.elapsed.total_seconds())
print('Actually elapsed time:', end_time - start_time)

Output

Output:
[
{"data": "param1 from destination1"},
{"data": "param2 from destination2"},
{"data": "param3 from destination3"}
]
HTTP response time: 0.001916
Actually elapsed time: 15.01642894744873

So, as you can see, the total elapsed time is 15 seconds, but the request timeout is just 7 seconds. The total time can exceed the timeout because a data chunk arrives every 5 seconds (sleep(5)), and each chunk resets the read timeout. So you can stay connected for hours or days by sending some data periodically.
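That "send some data periodically" idea can be sketched as a generator that interleaves whitespace heartbeats with real chunks. This is my own sketch, not code from the answer: `generate_with_heartbeats` and its job-queue shape are hypothetical, and it relies on the fact that whitespace between JSON tokens is legal, so the heartbeats don't break the final payload.

```python
import json
import queue
import threading


def generate_with_heartbeats(jobs, heartbeat=1.0):
    """Yield JSON array items as background jobs finish; emit newline
    heartbeats in between so the client's read timeout never fires."""
    results = queue.Queue()

    def worker():
        for job in jobs:
            results.put(job())  # each job returns a JSON-serializable dict
        results.put(None)       # sentinel: all jobs done

    threading.Thread(target=worker, daemon=True).start()

    yield '['
    first = True
    while True:
        try:
            item = results.get(timeout=heartbeat)
        except queue.Empty:
            yield '\n'          # heartbeat: keeps the connection warm
            continue
        if item is None:
            break
        if not first:
            yield ','
        yield json.dumps(item)
        first = False
    yield ']'
```

Wrapped in `stream_with_context(...)` like the route above, this would keep emitting at least one byte per `heartbeat` interval regardless of how slow the jobs are.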

In my requester example you have to wait until all data has been transferred. If you want to get the response from destination1 and start processing it without waiting for the response from destination2, you can use requests in its iterable (streaming) mode. In that case you will be able to process the {"data": "param1 from destination1"}, line immediately, without waiting for the next chunk.
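That iterable mode could look like the sketch below. Hedged assumptions: `parse_stream_line` is a hypothetical helper I wrote for the `[` / `,` / `]` framing that this particular server emits; it is not part of requests. The `requests.post(..., stream=True)` call with `iter_lines` is the standard requests streaming API.

```python
import json


def parse_stream_line(raw):
    """Decode one line of the server's streamed JSON array.
    Returns None for the '[' and ']' framing lines."""
    line = raw.strip()
    if line in ('[', ']'):
        return None
    if line.endswith(','):
        line = line[:-1]  # drop the trailing comma between array items
    return json.loads(line)


# Hypothetical usage against the server above (not executed here):
#
# import requests
# with requests.post('http://127.0.0.1:8000/multi_destination',
#                    json=json_data, stream=True) as r:
#     for raw in r.iter_lines(decode_unicode=True):
#         obj = parse_stream_line(raw)
#         if obj is not None:
#             print('got chunk:', obj)  # handled as soon as the server yields it
```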

You may also check this answer (client side) to understand how to process stream chunks without waiting for all the data to be downloaded.

rzlvmp