
The problem is that all I get is the following log; the scheduler sends the tasks, but they never seem to actually run:

celery_1  | [2021-03-15 19:00:00,124: INFO/MainProcess] Scheduler: Sending due task read_dof (api.tasks.read_dof)
celery_1  | [2021-03-15 19:00:00,140: INFO/MainProcess] Scheduler: Sending due task read_bdm (api.tasks.read_bdm)
celery_1  | [2021-03-15 19:00:00,141: INFO/MainProcess] Scheduler: Sending due task read_fixer (api.tasks.read_fixer)

I have the following configuration for Celery. "exchange" is the name of my Django project, which is where "celery.py" lives, and "api" is the name of my Django app, which is where my "tasks.py" lives:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.schedules import crontab

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "exchange.settings")

app = Celery("exchange")

app.config_from_object("django.conf:settings", namespace="CELERY")

app.autodiscover_tasks()

app.conf.beat_schedule = {
    'read_bdm': {
        'task': 'api.tasks.read_bdm',
        'schedule': crontab(hour=19, minute=0),
    },
    'read_dof': {
        'task': 'api.tasks.read_dof',
        'schedule': crontab(hour=19, minute=0),
    },
    'read_fixer': {
        'task': 'api.tasks.read_fixer',
        'schedule': crontab(hour=19, minute=0),
    },
}

Here is my tasks.py:

from celery import shared_task
from .models import BdmExch, DofExch, FixerExch
from .helpers.bdmcrawler import parse_bdm
from .helpers.dofcrawler import parse_dof
from .helpers.fixercrawler import parse_fixer

@shared_task(name='read_bdm')
def read_bdm():
    attempts = 0
    while attempts < 3:
        try:
            result = parse_bdm()
            print(result)
            BdmExch.objects.create(time=result["date"], exch=result["exc"])
            return
        except Exception:
            attempts += 1
            print("Parsing error on read_bdm")
    print("--------------- Parsing error on read_bdm -----------")
    return

@shared_task(name='read_dof')
def read_dof():
    attempts = 0
    while attempts < 3:
        try:
            result = parse_dof()
            DofExch.objects.create(time=result["date"], exch=result["exc"])
            return
        except Exception:
            attempts += 1
            print("Parsing error on read_dof")
    print("--------------- Parsing error on read_dof -----------")
    return

@shared_task(name='read_fixer')
def read_fixer():
    attempts = 0
    while attempts < 3:
        try:
            result = parse_fixer()
            FixerExch.objects.create(time=result["date"], exch=result["exc"])
            return
        except Exception:
            attempts += 1
            print("Parsing error on read_fixer")
    print("--------------- Parsing error on read_fixer -----------")
    return

As I said, this is in the api Django app. The parse_bdm, parse_dof and parse_fixer functions are simple implementations using requests and BeautifulSoup, or simple dictionaries, that read the data from the different sources. No problems arise when I simply run the task functions as if they were plain functions, so this leads me to believe there is a problem in my celery.py which I can't seem to pin down.
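
To be clear, running them "as plain functions" means calling them directly, which bypasses Celery entirely; a call only goes through the broker and a worker via .delay() or .apply_async(). A minimal illustration of the difference (assuming the broker and worker containers are up):

from api.tasks import read_bdm

read_bdm()        # plain call: runs synchronously in this process, Celery is bypassed
read_bdm.delay()  # sends the task to the broker for a worker to pick up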

Any help is much appreciated.

Thank you very much!

paula.em.lafon

2 Answers


It seems that the shared_task decorator does not play well with periodic tasks. I was looking into why that is and came across this post; the problem described there seems related. I would recommend using the @app.task decorator instead and seeing if it solves the issue.
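
For example, the tasks could be bound to the concrete app instance instead (a minimal sketch, assuming the app object from your exchange/celery.py is importable; adapt the import to your layout):

# api/tasks.py (sketch)
from exchange.celery import app  # the Celery instance defined in the question

@app.task(name='read_bdm')
def read_bdm():
    ...  # same body as before

Since autodiscover_tasks() only imports api/tasks.py lazily when the app is finalized, importing app here should not cause a circular import.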

Julien

Have you tried adding the __init__.py file for the Django project?

exchange/__init__.py

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

As far as I can tell, the documentation can be misleading here:

"The @shared_task decorator lets you create tasks without having any concrete app instance."
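
With that import in place, the shared tasks should end up registered on the project's app. A quick way to verify (a minimal check, assuming the layout from the question; run it inside the Django environment):

import os
import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "exchange.settings")
django.setup()

from exchange import celery_app
import api.tasks  # importing the module registers the shared tasks

# The names passed to @shared_task(name=...) should show up here:
print(sorted(n for n in celery_app.tasks if not n.startswith("celery.")))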

Rohith Samuel