
Is it possible to create a Celery task that just waits for a signal? I have this scenario:

  • Scrapyd in one virtualenv on remote machine A
  • Django project with Celery worker node in another virtualenv on remote machine A
  • The same Django project with Celery, but in another virtualenv on local machine B

How I use this setup:

  1. I would send a task chain from Django on machine B.
  2. The task chain would be consumed by the Celery worker node on machine A.
  3. In the first subtask of the task chain, I would schedule a crawl using Scrapyd's JSON over HTTP API, and pass the Celery task ID to the crawler as an HTTP request parameter.
  4. I then want this first subtask to just wait for some kind of signal.
  5. Scrapyd does its thing and runs the spider.
  6. Once the spider is done crawling, I want it to send a signal, maybe by JSON over HTTP or by a Django management command, to the subtask that has been waiting for the signal.
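For context, step 3 might look roughly like the sketch below. Scrapyd's `schedule.json` endpoint forwards any POST parameter it doesn't recognize to the spider as a keyword argument, so the Celery task ID can ride along. The host, project name, and spider name here are placeholders, not anything from my real setup:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

SCRAPYD_URL = "http://machine-a:6800/schedule.json"  # placeholder address

def build_schedule_payload(project, spider, task_id):
    # schedule.json passes unknown parameters through to the spider,
    # so the spider will receive celery_task_id in its __init__.
    return {"project": project, "spider": spider, "celery_task_id": task_id}

def schedule_crawl(project, spider, task_id):
    """Schedule a crawl on Scrapyd and return the job ID it assigns.

    Inside a bound Celery task (@shared_task(bind=True)), task_id
    would be self.request.id.
    """
    data = urlencode(build_schedule_payload(project, spider, task_id)).encode()
    with urlopen(SCRAPYD_URL, data=data) as resp:
        return json.load(resp)["jobid"]
```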

Is this doable?
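For step 6, my current thinking is something like the following on the crawler side: the spider keeps the `celery_task_id` it was scheduled with and, from Scrapy's `closed(reason)` hook, POSTs a JSON notification back to the Django side. The callback URL and the view behind it are hypothetical, just to show the shape:

```python
import json
from urllib.request import Request, urlopen

DJANGO_CALLBACK_URL = "http://machine-b:8000/crawl-done/"  # assumed endpoint

def build_notification(task_id, reason):
    # Body a (hypothetical) Django view would parse to find
    # the subtask waiting on this crawl.
    return {"celery_task_id": task_id, "reason": reason}

def notify_crawl_done(task_id, reason="finished"):
    """Call this from the spider's closed(reason) hook.

    Scrapyd hands the celery_task_id parameter from schedule.json to
    the spider's __init__, so the spider can hold on to it until closing.
    """
    body = json.dumps(build_notification(task_id, reason)).encode()
    req = Request(DJANGO_CALLBACK_URL, data=body,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return resp.status
```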

I would just need code snippets to show me how to wait for a signal in a subtask, and how to restore a task from the task ID and send a signal to it.
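As far as I can tell, `AsyncResult(task_id)` lets you look a task up by its ID, but only to inspect its state or revoke it, not to deliver an arbitrary signal into a running task. So the closest thing I've come up with is polling a store both machines can reach (Redis, memcached, Django's cache). The sketch below uses a plain dict as a stand-in for that store, and the key scheme and function names are my own invention, not Celery APIs; with Django you would swap the dict operations for `cache.get()`/`cache.set()`:

```python
import time

SIGNAL_STORE = {}  # stand-in for a shared store like Django's cache or Redis

def signal_done(task_id, result=None):
    """Crawler side: mark the crawl for this Celery task ID as done.

    This would be called by the Django view (or management command)
    that the spider notifies when it finishes.
    """
    SIGNAL_STORE[f"crawl-done:{task_id}"] = result if result is not None else True

def wait_for_signal(task_id, timeout=600, poll_interval=0.01):
    """Subtask side: block until the completion key appears, or time out."""
    key = f"crawl-done:{task_id}"
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if key in SIGNAL_STORE:
            return SIGNAL_STORE.pop(key)
        time.sleep(poll_interval)
    raise TimeoutError(f"no completion signal for task {task_id}")
```

Is there a cleaner, more Celery-native way to do this than polling?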

Kal