I'm having a lot of problems executing certain tasks with celery beat. Some tasks, like the one below, get triggered by beat, but the message is never received by RabbitMQ.

In my django settings file I have the following periodic task:

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    ...
    'update_locations': {
        'task': 'cron.tasks.update_locations',
        'schedule': crontab(hour='10', minute='0')
    },
    ...
}

At 10:00 UTC, beat dispatches the task as expected:

[2015-05-13 10:00:00,046: DEBUG/MainProcess] cron.tasks.update_locations sent. id->a1c53d0e-96ca-4673-9d03-972888581176

but this message never arrives at RabbitMQ (I'm using RabbitMQ's tracing module to track incoming messages). I have several other tasks which seem to run fine, but certain tasks like the one above never run. Running the task manually in django with cron.tasks.update_locations.delay() works with no problem. Note that my RabbitMQ is on a different server than beat.

Is there anything I can do to ensure the message was actually sent and/or received by rabbitmq? Is there a better or other way to schedule these tasks to ensure they run?

Pim
  • For the time being I have set up crontab to run the scheduled tasks using a custom django command. If the tasks execute as expected and are acknowledged by RabbitMQ, then the problem is solely with beat. – Pim May 13 '15 at 18:34

1 Answer


A bit hard to answer from these minimal descriptions.

Why is this in the Django settings file? I would have expected the Celery config settings to live in their own config object. Look at http://celery.readthedocs.org/en/latest/reference/celery.html#celery.Celery.config_from_object
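As a rough sketch of what that separation might look like (module and broker names here are hypothetical placeholders, and this assumes Celery 3.1-era uppercase setting names, including the documented CELERY_TASK_PUBLISH_RETRY settings, which may be worth enabling since your broker is on a remote host):

```python
# celeryconfig.py -- a dedicated Celery config module (hypothetical name)
from celery.schedules import crontab

BROKER_URL = 'amqp://guest@rabbitmq-host//'  # placeholder broker URL

CELERYBEAT_SCHEDULE = {
    'update_locations': {
        'task': 'cron.tasks.update_locations',
        'schedule': crontab(hour='10', minute='0'),
    },
}

# Retry publishing a task message if the broker connection drops;
# relevant when RabbitMQ runs on a different server than beat.
CELERY_TASK_PUBLISH_RETRY = True
CELERY_TASK_PUBLISH_RETRY_POLICY = {'max_retries': 3}
```

```python
# in the app module
from celery import Celery

app = Celery('proj')
app.config_from_object('celeryconfig')
```

That at least rules out Django settings loading order as a variable.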

pbhowmick
  • Followed http://stackoverflow.com/questions/20116573/in-celery-3-1-making-django-periodic-task . Beat does find all the tasks in the django settings file on startup, and it does execute the task, but the message never arrives at RabbitMQ. Wish I could provide more info; there's not much to go by other than beat logging that it sent the message and RabbitMQ never receiving it. – Pim May 13 '15 at 15:05
  • @Pim I would be interested in seeing how the worker is being invoked on the command-line – pbhowmick May 13 '15 at 16:20
  • I'm currently using "python manage.py celery beat -l debug" via djcelery's built-in commands. This command is run under supervisor to daemonize beat. Previously I used https://raw.githubusercontent.com/celery/celery/3.1/extra/generic-init.d/celerybeat which had the exact same results. – Pim May 13 '15 at 16:31