
I have developed an application that takes an image and does some heavy work on the GPU. The problem is that if one request is already being processed (an image is being processed on the GPU) and another image-processing request reaches the server, an error occurs in the GPU-handling logic. I therefore want the server to process requests sequentially, i.e. to queue them: a new request should not start until the previous one has finished. How can this be implemented?
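For reference, here is a simplified sketch of what the endpoint looks like, assuming a FastAPI server (`run_gpu_processing` is just a placeholder name for my actual GPU pipeline):

```python
from fastapi import FastAPI, File, UploadFile
from fastapi.responses import Response

app = FastAPI()


def run_gpu_processing(data: bytes) -> bytes:
    """Placeholder for the real GPU pipeline (hypothetical name)."""
    raise NotImplementedError


@app.post("/process")
async def process_image(file: UploadFile = File(...)):
    data = await file.read()
    # Fails if a second request reaches this line while the GPU is still busy.
    result = run_gpu_processing(data)
    return Response(content=result, media_type="image/png")
```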

I have read about Celery and message brokers like RabbitMQ, but I don't fully understand whether they are what I need in my case.
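For example, would an in-process approach like the sketch below be enough to serialize the GPU work (again assuming FastAPI, with `run_gpu_processing` as a placeholder for my actual pipeline), or is a full task queue with a broker really necessary?

```python
import asyncio

from fastapi import FastAPI, File, UploadFile
from fastapi.responses import Response

app = FastAPI()
gpu_lock = asyncio.Lock()  # only one request may touch the GPU at a time


def run_gpu_processing(data: bytes) -> bytes:
    """Placeholder for the real GPU pipeline (hypothetical name)."""
    raise NotImplementedError


@app.post("/process")
async def process_image(file: UploadFile = File(...)):
    data = await file.read()
    async with gpu_lock:
        # Run the blocking GPU call in a worker thread so the event loop
        # stays responsive while other requests wait for the lock.
        result = await asyncio.to_thread(run_gpu_processing, data)
    return Response(content=result, media_type="image/png")
```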

  • A task queue is definitively the way to go. A RabbitMQ queue with a single worker would do what you want. – MatsLindh Feb 19 '23 at 09:56
  • In addition to Mats's suggestion above, have a look at [this answer](https://stackoverflow.com/a/71517830/17865804), which may provide another way for you to achieve sequential processing of client requests, for example, using `async def` endpoints, given that there is **no** `await` call (or `async for` or `async with` block) inside such endpoints. – Chris Feb 20 '23 at 06:32
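As a follow-up to the comments above, here is a minimal sketch of the pattern the linked answer describes (assuming FastAPI; `run_gpu_processing` is still a placeholder): since all `async def` endpoints run on a single event loop, an endpoint that never awaits holds the loop for its whole duration, so requests are effectively handled one at a time (at the cost of blocking everything else on that loop).

```python
from fastapi import FastAPI, File, UploadFile
from fastapi.responses import Response

app = FastAPI()


def run_gpu_processing(data: bytes) -> bytes:
    """Placeholder for the real GPU pipeline (hypothetical name)."""
    raise NotImplementedError


@app.post("/process")
async def process_image(file: UploadFile = File(...)):
    # No `await` anywhere in the body: FastAPI runs `async def` endpoints
    # directly on the event loop, so this blocking code holds the loop
    # until it finishes and incoming requests are processed strictly
    # one after another.
    data = file.file.read()            # synchronous read of the upload
    result = run_gpu_processing(data)  # blocking GPU call
    return Response(content=result, media_type="image/png")
```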

0 Answers