As the title says, I want a task to run in the background, processing a large database query, while I keep handling requests from my frontend. Can I even do that with async/asyncio? I was told it was possible...
For context, I would like to do something like the following. Note also that I don't really need the function to tell me when it's done (though I'd sure like to know if that's possible), since I just check whether the .json file has finally been written:
def post_data_view(request):
    if request.method == 'POST':
        ...
        do_long_query_in_the_background(some_data)
        return HttpResponse('Received. Ask me in a while if I finished.')

def is_it_done_view(request):
    if request.method == 'GET':
        data = find_json()
        if data:
            return JsonResponse(data)
        else:
            return HttpResponse('Not yet dude...')

async def do_long_query_in_the_background(data):
    # do some long processing...
    # dump the result to a result.json
    return
I was told this was possible with async, but I really find it hard to understand. I tried to simplify this a lot, and even then I found I didn't quite understand what was happening:
async def f():
    while True:
        print(0)
        await asyncio.sleep(2)

asyncio.create_task(f())
Even this code fails with sys:1: RuntimeWarning: coroutine 'f' was never awaited, yet it does work in the console, and I do not understand why that is.
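From what I can tell, asyncio.create_task() only works while an event loop is already running, which would explain why it fails at the top level of a plain script but works in an async-aware console (e.g. IPython) that runs a loop for you. A minimal sketch of what I mean (the range(3) bound and the short sleep are just placeholders so the script terminates):

```python
import asyncio

async def f():
    # stand-in for the long job: loop a few times instead of forever
    count = 0
    for _ in range(3):
        print(0)
        await asyncio.sleep(0.01)
        count += 1
    return count

async def main():
    # create_task() needs a *running* event loop; inside main() one exists,
    # so this works, whereas calling it at module level in a script does not
    task = asyncio.create_task(f())
    return await task  # keep the loop alive until f() finishes

result = asyncio.run(main())
```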
I also wondered whether this is at all possible, and safe, to do with threading instead?
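This is roughly what I imagine the threading version would look like. To be clear, everything here is a placeholder: the fake "result", the out_path argument, and the final join() (which I only do so the sketch terminates, where the real GET view would just poll for the file):

```python
import json
import os
import tempfile
import threading

def do_long_query_in_the_background(data, out_path):
    # Fire-and-forget worker: run the long job in a daemon thread and
    # dump the result to a JSON file that another view can poll for.
    def worker():
        result = {"rows": data}  # placeholder for the real query result
        with open(out_path, "w") as fh:
            json.dump(result, fh)

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t  # caller may join() if it ever needs to wait

# usage sketch
path = os.path.join(tempfile.mkdtemp(), "result.json")
t = do_long_query_in_the_background([1, 2, 3], path)
t.join()  # only for this demo; a view would check the file instead
with open(path) as fh:
    loaded = json.load(fh)
```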
I'm extremely frustrated with this, because the general solution suggested in other threads seems to be to just use Celery, but that really feels like overkill for a not-so-complex problem.