HTTP is a synchronous protocol: each waiting client consumes server resources (CPU, memory, file descriptors) until it gets a response. This means the web server has to respond quickly and should not block on long-running external processes while handling a request.
The solution is to process requests asynchronously. There are two major options:
Use polling.
`POST` pushes a new task to a message queue:
```
POST /api/generate_report

{
    "report_id": 1337
}
```
`GET` checks the MQ (or a database) for a result:
```
GET /api/report?id=1337

{
    "ready": false
}

GET /api/report?id=1337

{
    "ready": true,
    "report": "Lorem ipsum..."
}
```
Asynchronous tasks in the Django ecosystem are usually implemented with Celery, but you can use any MQ directly.
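For illustration, here is a minimal sketch of the polling flow with Celery. The task, the view names, and the use of Celery's own task id (instead of the `report_id` above) are my assumptions, and a Celery result backend has to be configured:

```python
# Hypothetical sketch: a Celery task plus two plain Django views for the polling flow.
from celery import shared_task
from celery.result import AsyncResult
from django.http import JsonResponse


@shared_task
def generate_report(report_id):
    # The long-running work happens in a Celery worker, not in the web process.
    return 'Lorem ipsum...'


def start_report(request):
    # POST /api/generate_report -- enqueue the task and return immediately.
    async_result = generate_report.delay(1337)
    return JsonResponse({'task_id': async_result.id})


def check_report(request):
    # GET /api/report?task_id=... -- ask the result backend whether the task is done.
    async_result = AsyncResult(request.GET['task_id'])
    if async_result.ready():
        return JsonResponse({'ready': True, 'report': async_result.get()})
    return JsonResponse({'ready': False})
```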
Use WebSockets.
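With WebSockets the server can push the result to the client as soon as it is ready, instead of being polled. A rough sketch with Django Channels (see the last link below) might look like this; the consumer, the group name, and the message format are assumptions, and a channel layer has to be configured:

```python
# Hypothetical Django Channels consumer that pushes the finished report to the client.
import json
from channels.generic.websocket import AsyncWebsocketConsumer


class ReportConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        # Subscribe this socket to a group that the background worker will publish to.
        await self.channel_layer.group_add('reports', self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard('reports', self.channel_name)

    async def report_ready(self, event):
        # Invoked for messages sent to the group with {'type': 'report.ready', ...}.
        await self.send(text_data=json.dumps({'ready': True, 'report': event['report']}))
```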
Helpful links:
- What are Long-Polling, Websockets, Server-Sent Events (SSE) and Comet?
- https://en.wikipedia.org/wiki/Push_technology
- https://www.reddit.com/r/django/comments/4kcitl/help_in_design_for_long_running_requests/
- https://realpython.com/asynchronous-tasks-with-django-and-celery/
- https://blog.heroku.com/in_deep_with_django_channels_the_future_of_real_time_apps_in_django
Edit:
Here is an example of how you can reuse a connection to an MQ:
`projectName/appName/services.py`:
```python
import stomp

def create_connection():
    # Connect to the broker over STOMP (host and port are examples).
    conn = stomp.Connection([('localhost', 9998)])
    conn.start()  # only needed (and only present) in stomp.py < 5
    conn.connect(wait=True)
    return conn

# Module-level code runs once per worker process, at import time,
# so every request reuses the same connection.
print('This code will be executed only once per process')
activemq = create_connection()
```
`projectName/appName/views.py`:
```python
from django.http import HttpResponse

from .services import activemq

def index(request):
    # Reuse the already-open connection instead of reconnecting on every request.
    activemq.send(destination='bar', body='foo')
    return HttpResponse('Success!')
```
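Because `activemq` lives at module level, it is created once per worker process and then shared by every request that process handles; if you run several worker processes, each one holds its own connection.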