
I have created a simple fullstack app that needs to run some long calculations on the backend for a specific request. To follow the progress of those calculations, I send polling requests to the backend at a fixed interval.

The problem is that those polling requests stay pending until I manually reload my backend.

# Server side
Initial request:

import uuid
from typing import Optional

from fastapi import BackgroundTasks, FastAPI, File, UploadFile

app = FastAPI()

@app.post("/", status_code=202)
async def read_root(
    background_tasks: BackgroundTasks,
    file: Optional[UploadFile] = File(None),
):
    task_id = str(uuid.uuid4())
    # launch_process does the calculations and keeps track of the progress
    background_tasks.add_task(launch_process, task_id, file)
    return {"id": task_id}
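For reference, a minimal version of the pieces this snippet assumes — the in-memory `store` and `launch_process` — could look roughly like this (a sketch only; `long_synchronous_function` is a stand-in for the real heavy computation):

```python
# Sketch of the assumed helpers: an in-memory progress store and the
# background task that fills it in. `long_synchronous_function` here is
# a stand-in for the actual heavy computation.
store: dict = {}

def long_synchronous_function(content: bytes, task_id: str) -> None:
    total = 10
    for i in range(total):
        # ... heavy CPU-bound work on `content` would go here ...
        store[task_id]["progress"] = int((i + 1) * 100 / total)

async def launch_process(task_id: str, file=None) -> None:
    store[task_id] = {"progress": 0}
    content = await file.read() if file else b""
    long_synchronous_function(content, task_id)
```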

I'm using Next.js for the frontend, but for demonstration purposes I'm using vanilla JS here, as it doesn't work either.

const progressDiv = document.getElementById('progressDiv')

const sendFile = async (data) => {

 let progress = 0
 progressDiv.textContent = progress
 
 const res1 = await fetch("http://localhost:8000", {
    method: 'post',
    body: data,
 })
 
 const jsonData = await res1.json()
 const id = jsonData.id

 const idInterval = setInterval(() => {

    fetch(`http://localhost:8000/polling/${id}`)
         .then(res => res.json())
         .then(json => {
           progress = json.progress

           progressDiv.textContent = progress

           if (progress === 100) {
             clearInterval(idInterval)
           }

         })
 }, 3000)
}

None of the polling requests go through; they all stay pending:

@app.get("/polling/{task_id}")
def polling(task_id: str):
    # the store keeps track of all the task data
    progress = store[task_id]["progress"]
    return {"progress": progress}
Chris

1 Answer


Thanks to @Chris's suggestion, I understood where my problem came from.

My background task function (`launch_process()`) performed some heavy synchronous computations that blocked the event loop. So even though my initial POST request returned a response, every subsequent request was blocked (stayed pending) by the ongoing `launch_process()` call.
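The blocking effect is easy to reproduce outside FastAPI with plain asyncio (a standalone demonstration, not the asker's code): a synchronous call inside a coroutine prevents every other coroutine from running until it finishes.

```python
# Demonstrates how synchronous work inside a coroutine blocks the event loop:
# poll() cannot run until heavy_task() yields, and time.sleep() never yields.
import asyncio
import time

events = []

async def heavy_task():
    time.sleep(0.2)          # synchronous: blocks the whole event loop
    events.append("heavy done")

async def poll():
    events.append("poll ran")

async def main():
    t = asyncio.create_task(heavy_task())
    await asyncio.sleep(0.05)  # main yields; heavy_task starts and blocks the loop
    await poll()               # only runs after heavy_task has finished
    await t

asyncio.run(main())
print(events)  # ['heavy done', 'poll ran']
```

Replace `time.sleep(0.2)` with `await asyncio.sleep(0.2)` and the order flips, because the coroutine then yields control back to the loop.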

The way I solved it was to use `fastapi.concurrency.run_in_threadpool`.

I changed my code from:

async def launch_process(task_id, file):
    content = await file.read()
    long_synchronous_function(content, task_id)
    return

to:

from fastapi.concurrency import run_in_threadpool

async def launch_process(task_id, file):
    content = await file.read()
    await run_in_threadpool(long_synchronous_function, content, task_id)
    return

This post also helped me (it gives several solutions to the problem).
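One such alternative, for anyone on Python 3.9+, is the standard library's `asyncio.to_thread`, which offloads a blocking call to a worker thread much like `run_in_threadpool` does (a sketch; `long_synchronous_function` is a stand-in for the real computation):

```python
# Alternative using only the standard library (Python 3.9+):
# asyncio.to_thread runs a blocking function in a worker thread,
# keeping the event loop free to serve other requests meanwhile.
import asyncio

def long_synchronous_function(content: bytes, task_id: str) -> int:
    return len(content)  # placeholder for the heavy computation

async def launch_process(task_id: str, content: bytes) -> int:
    # The event loop stays responsive while the worker thread computes.
    return await asyncio.to_thread(long_synchronous_function, content, task_id)

print(asyncio.run(launch_process("t1", b"abc")))  # 3
```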
