
Suppose I need to request multiple servers to make a response

def view_or_viewset(request):
    d1 = request_a_server()  # something like requests.get(url, data)
    d2 = request_b_server()
    d3 = request_c_server()

    d4 = do_something_with(d3)

    return Response({"foo1": d1, "foo2": d2, "foo3": d3, "foo4": d4})

I'm making a synchronous request to each server, one after another, and I suspect there must be a better way of handling this kind of scenario.

(I would use Celery if it were a long-running task, but it isn't; still, making multiple synchronous requests in sequence doesn't seem right.)

What's the recommended paradigm for handling this?
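One common approach is to keep the view synchronous but fan the independent requests out to a thread pool with the stdlib's `concurrent.futures`. A minimal sketch, with the question's hypothetical `request_*_server()` helpers stubbed out by `time.sleep` so it is self-contained:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def request_a_server():
    time.sleep(0.1)  # stand-in for a blocking HTTP call
    return "a"

def request_b_server():
    time.sleep(0.1)
    return "b"

def request_c_server():
    time.sleep(0.1)
    return "c"

def do_something_with(d3):
    return d3.upper()  # stand-in for the post-processing step

def view_or_viewset():
    # Run the three independent requests concurrently; total wall time is
    # roughly the slowest single request instead of the sum of all three.
    with ThreadPoolExecutor(max_workers=3) as pool:
        f1 = pool.submit(request_a_server)
        f2 = pool.submit(request_b_server)
        f3 = pool.submit(request_c_server)
        d1, d2, d3 = f1.result(), f2.result(), f3.result()
    d4 = do_something_with(d3)  # depends on d3, so it stays sequential
    return {"foo1": d1, "foo2": d2, "foo3": d3, "foo4": d4}
```

The pool manages thread lifetimes and propagates exceptions through `.result()`, so there is no manual thread handling involved.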

Edit:

I was expecting an answer using async/await or aiohttp (or possibly yield?).

My question was flagged as a possible duplicate, and the answers there suggest using threading. I think manually managing threads is something to avoid (based on my past experience with multithreading in C++).

Then I found https://stackoverflow.com/a/23902034/433570; the requests-futures approach there seems promising.
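The future-based pattern referred to above (as in the requests-futures library, where a `session.get()`-style call returns a future immediately and `.result()` blocks only when the response is needed) can be sketched with the stdlib; `fetch()` here is a hypothetical stand-in for a real HTTP call:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    time.sleep(0.1)  # stand-in for network latency of a GET request
    return "response from " + url

with ThreadPoolExecutor(max_workers=3) as pool:
    # All three requests are in flight before the first .result() blocks,
    # which is exactly the ergonomics requests-futures provides.
    futures = {name: pool.submit(fetch, name) for name in ("a", "b", "c")}
    responses = {name: f.result() for name, f in futures.items()}
```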

eugene
    You can use [`multiprocessing`](https://docs.python.org/2/library/multiprocessing.html) to process collections in parallel. – Willem Van Onsem May 18 '19 at 14:30
  • Check out https://stackoverflow.com/questions/17601698/can-django-do-multi-thread-works or https://stackoverflow.com/questions/21945052/simple-approach-to-launching-background-task-in-django/21945663#21945663 – frnhr May 18 '19 at 15:22
    Possible duplicate of [Simple approach to launching background task in Django](https://stackoverflow.com/questions/21945052/simple-approach-to-launching-background-task-in-django) – frnhr May 18 '19 at 15:22
    https://stackoverflow.com/questions/50757497/simplest-async-await-example-possible-in-python – VnC May 18 '19 at 16:52

0 Answers