Is it possible to create a Celery task that just waits for a signal? I have this scenario:
- Scrapyd in one virtualenv on remote machine A
- Django project with Celery worker node in another virtualenv on remote machine A
- The same Django project with Celery, but in another virtualenv on local machine B
How I want to use this setup:
- I would send a task chain from the Django project on machine B.
- The chain would then be consumed by the Celery worker node on machine A.
- In the first subtask of the chain, I would schedule a crawl via Scrapyd's JSON-over-HTTP API and pass the Celery task ID to the crawler as an HTTP request parameter.
- I then want this first subtask to just wait for some kind of signal (the first sketch after this list shows what I have in mind).
- Scrapyd does its thing and runs the spider.
- Once the spider is done crawling, I want it to send a signal, maybe via JSON over HTTP or via a Django management command, to the subtask that has been waiting for it (the second sketch below is roughly what I picture).
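
To make this concrete, here is roughly what I imagine the first subtask looking like. The Scrapyd URL, the project/spider names, and the `crawl_done:` cache key are all placeholders, and the polling loop is only my guess at how the waiting could be done (it assumes a cache backend shared between processes, e.g. Redis or memcached); I don't know whether Celery offers a proper primitive for blocking on an external signal.

```python
import time

import requests
from celery import shared_task
from django.core.cache import cache


@shared_task(bind=True)
def schedule_crawl_and_wait(self, project, spider):
    task_id = self.request.id

    # Schedule the crawl through Scrapyd's JSON API; extra POST parameters
    # are handed to the spider as spider arguments, so the Celery task ID
    # travels along with the crawl request.
    requests.post(
        "http://machine-a.example:6800/schedule.json",  # placeholder URL
        data={
            "project": project,          # placeholder Scrapyd project name
            "spider": spider,            # placeholder spider name
            "celery_task_id": task_id,
        },
    )

    # "Wait for a signal": here I simply poll a cache key that something
    # else sets once the spider has finished. This is the part I am unsure
    # about -- is there a cleaner way for the task to block until signalled?
    while not cache.get(f"crawl_done:{task_id}"):
        time.sleep(5)

    return cache.get(f"crawl_done:{task_id}")
```

On machine B I would then dispatch something like `chain(schedule_crawl_and_wait.s("myproject", "myspider"), process_results.s())()` and let the worker on machine A consume it (`process_results` being whatever comes next in my chain).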
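
And this is roughly how I picture the Scrapyd side: the spider receives the task ID as a spider argument (since Scrapyd passes extra `schedule.json` parameters through to the spider) and notifies the Django project when it closes. The notification URL is a placeholder; it could just as well be the management command mentioned above.

```python
import requests
import scrapy


class MySpider(scrapy.Spider):
    name = "myspider"  # placeholder spider name

    def __init__(self, celery_task_id=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Arrives here because Scrapyd forwards extra schedule.json
        # parameters as spider arguments.
        self.celery_task_id = celery_task_id

    def closed(self, reason):
        # Called automatically when the spider finishes; this is where I
        # would "send the signal" back to the waiting subtask.
        requests.post(
            "http://machine-a.example:8000/crawl-done/",  # placeholder endpoint
            data={"task_id": self.celery_task_id, "reason": reason},
        )
```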
Is this doable?
I just need code snippets showing how to make a subtask wait for a signal, and how to get hold of a task from its task ID and send a signal to it.
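
For the management-command variant of the signal, the best I have come up with is something like the sketch below (the command name and cache key are made up to match the sketches above, and the file would live at e.g. `myapp/management/commands/crawl_done.py`). What I would really like to know is whether this indirection is necessary at all, i.e. whether Celery can look up the waiting task by its ID and deliver a signal to it directly.

```python
from django.core.cache import cache
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Signal the Celery subtask waiting on the given task ID."

    def add_arguments(self, parser):
        parser.add_argument("task_id")

    def handle(self, *args, **options):
        # Setting this key is the "signal" the polling subtask waits for.
        cache.set(f"crawl_done:{options['task_id']}", "finished", timeout=None)
        self.stdout.write(f"Signalled task {options['task_id']}")
```

The spider (or anything else on machine A) would then run `python manage.py crawl_done <task_id>` when the crawl finishes.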