
I'm developing a system that has two components - a Django web application and a monitor process.

The monitor process connects to a TCP/IP server somewhere and receives event indications using a proprietary protocol. It can also send commands to the server, again, with a proprietary protocol.

I want to run the monitor as a daemon. It will connect to the server and continuously monitor the incoming events. Whenever an event arrives, the monitor will update the database. The web application will get the current state from the database.
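For concreteness, the monitor's receive loop would look roughly like this, with `parse_event()` and `store_event()` as placeholders for the proprietary protocol decoding and the database update that the web app reads from:

```python
# monitor.py -- rough sketch of the monitor's receive loop.
# parse_event() and store_event() are placeholders for the proprietary
# protocol and for the database write the web app reads from.
import select
import socket

def run_monitor(host, port, parse_event, store_event):
    sock = socket.create_connection((host, port))
    try:
        while True:
            # Block until the server sends an event indication.
            readable, _, _ = select.select([sock], [], [])
            if sock in readable:
                data = sock.recv(4096)
                if not data:  # server closed the connection
                    break
                store_event(parse_event(data))
    finally:
        sock.close()
```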

I'm a bit stuck with sending commands, though. I will need the Django web-app to somehow communicate with the monitor service. I can use Pyro, as recommended here, but I'd much rather use Celery, as I'm already using it for other parts of the system.

How can I get a Celery worker to both manage the monitor connection and serve Celery requests?

Note: Due to limitations of the proprietary protocol, I can't open two connections to the server, and so can't have one process monitor the event and another sending commands.

zmbq

1 Answer


If you really want to use Celery for this use case, I suggest defining a separate queue, e.g. server_monitor, and routing all server-monitor tasks to that queue. Then, to avoid opening multiple connections to the server, run the worker with -c 1 so there is only a single worker process.
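For instance, the routing could look like this (the task name monitor.tasks.send_command is illustrative; on the Celery 3.x series the setting is named CELERY_ROUTES rather than task_routes):

```python
# celeryconfig.py -- route the monitor tasks to their own queue so they
# are only ever picked up by the dedicated single-process worker.
# 'monitor.tasks.send_command' is a hypothetical task name.
task_routes = {
    'monitor.tasks.send_command': {'queue': 'server_monitor'},
}
```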

Also, since you want the worker to process both server-monitor events and ordinary Celery requests, have it consume from both queues with -Q celery,server_monitor. The worker will then serve both types of requests, but beware: if your celery queue is under heavy traffic, it might take a long time to process a request from the server_monitor queue.
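Assuming your Celery app lives in a module named proj (just an example name), the invocation would be along the lines of:

```
celery -A proj worker -Q celery,server_monitor -c 1
```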

Maciej Gol
  • The problem is with the 'server monitor tasks' - the monitor runs these tasks and *also* monitors - using a straight socket and select loop. – zmbq Oct 05 '13 at 08:58
  • Since there would be only a single process running those tasks, you can create a monitor inside the worker process that will receive and send commands to the server. – Maciej Gol Oct 05 '13 at 09:03
  • Start another thread in the worker? Cool idea! I can do that in a specific task (start_monitoring), but I'd rather it happened automatically when the daemon starts. Do workers get to handle such start-up events? – zmbq Oct 05 '13 at 09:04
  • 1
    There are multiple signals you can connect to, used in [Worker class](https://github.com/celery/celery/blob/master/celery/apps/worker.py). The whole listing of signals is [here](https://github.com/celery/celery/blob/master/celery/signals.py). You can connect to the `signals.celeryd_after_setup` signal in your worker code (this should be in a global namespace so it's ran immediately), where you are passed whole configuration allowing to decide actions on passed worker params. – Maciej Gol Oct 05 '13 at 09:16
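As a minimal sketch of what that hookup could look like, where run_monitor is the hypothetical socket/select loop from the question:

```python
# tasks.py -- connect at module level, so the handler is registered as
# soon as the worker imports this module.
import threading

from celery.signals import celeryd_after_setup

@celeryd_after_setup.connect
def start_monitor(sender, instance, **kwargs):
    # Run the (hypothetical) monitor loop in a background thread inside
    # the worker process, so the single server connection lives here.
    monitor = threading.Thread(target=run_monitor)
    monitor.daemon = True  # don't block worker shutdown
    monitor.start()
```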