I want to run this task every three minutes, and this is what I have:
tasks.py

```python
@shared_task
def save_hackathon_to_db():
    logger.info('ran')
    loop = asyncio.get_event_loop()
    statuses = ['ended', 'open', 'upcoming']
    loop.run_until_complete(send_data(statuses))
    logger.info('ended')
settings.py

```python
CELERY_BEAT_SCHEDULE = {
    "devpost_api_task": {
        "task": "backend.tasks.save_hackathon_to_db",
        "schedule": crontab(minute="*/3"),
    },
}
```
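For reference, `crontab(minute="*/3")` fires at wall-clock minutes divisible by 3. If any three-minute interval is acceptable, the schedule can equivalently be a plain `timedelta` (a sketch against the same task path):

```python
from datetime import timedelta

CELERY_BEAT_SCHEDULE = {
    "devpost_api_task": {
        "task": "backend.tasks.save_hackathon_to_db",
        # Fire every 180 seconds, counted from beat startup rather
        # than aligned to wall-clock minute boundaries.
        "schedule": timedelta(minutes=3),
    },
}
```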
However, the task does not run every 3 minutes; it only runs when I send a POST request to http://0.0.0.0:8000/hackathon/task followed by a GET request to http://0.0.0.0:8000/hackathon/&lt;task_id&gt;.

This is the code for the view functions behind those routes, respectively:
```python
@csrf_exempt
def run_task(request):
    if request.method == 'POST':
        task = save_hackathon_to_db.delay()
        return JsonResponse({"task_id": task.id}, status=202)

@csrf_exempt
def get_status(request, task_id):
    print(task_id)
    task_result = AsyncResult(task_id)
    result = {
        "task_id": task_id,
        "task_status": task_result.status,
        "task_result": task_result.result
    }
    return JsonResponse(result, status=200)
```
docker-compose.yml

```yaml
version: "3.9"
services:
  backend:
    build: ./backend
    ports:
      - "8000:8000"
    command: python3 manage.py runserver 0.0.0.0:8000
    depends_on:
      - db
      - redis
  celery:
    build: ./backend
    command: celery -A backend worker -l INFO --logfile=logs/celery.log
    volumes:
      - ./backend:/usr/src/app
    depends_on:
      - backend
      - redis
  celery-beat:
    build: ./backend
    command: celery -A backend beat -l info
    volumes:
      - ./backend/:/usr/src/app/
    depends_on:
      - redis
  dashboard:
    build: ./backend
    command: flower -A backend --port=5555 --broker=redis://redis:6379/0
    ports:
      - 5555:5555
    depends_on:
      - backend
      - redis
      - celery
  db:
    image: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: ${DB_NAME}
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
  redis:
    image: redis:6-alpine

volumes:
  postgres_data:
```
EDIT #1: I doubt possible solution #1 applies, because I do have the producer running.
These are the initial logs I get from celery-beat:

```
celery-beat_1  | celery beat v4.4.7 (cliffs) is starting.
celery-beat_1  | ERROR: Pidfile (celerybeat.pid) already exists.
celery-beat_1  | Seems we're already running? (pid: 1)
```
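That pidfile error looks like the actual blocker: beat exits at startup, so nothing ever enqueues the scheduled task. A stale `celerybeat.pid` can survive restarts when the project directory is bind-mounted into the container. A common workaround, assuming this compose setup, is to move the pidfile out of the bind mount (or disable it by passing an empty `--pidfile=`):

```shell
# docker-compose.yml, celery-beat service: keep the pidfile outside the
# bind-mounted ./backend directory so a stale file cannot block startup.
command: celery -A backend beat -l info --pidfile=/tmp/celerybeat.pid
```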
celery logs:

```
celery_1       | /usr/local/lib/python3.9/site-packages/celery/platforms.py:800: RuntimeWarning: You're running the worker with superuser privileges: this is
celery_1       | absolutely not recommended!
celery_1       |
celery_1       | Please specify a different user using the --uid option.
celery_1       |
celery_1       | User information: uid=0 euid=0 gid=0 egid=0
celery_1       |
celery_1       |   warnings.warn(RuntimeWarning(ROOT_DISCOURAGED.format(
celery_1       |
celery_1       |  -------------- celery@948692de8c79 v4.4.7 (cliffs)
celery_1       | --- ***** -----
celery_1       | -- ******* ---- Linux-5.10.25-linuxkit-x86_64-with 2021-09-10 19:59:53
celery_1       | - *** --- * ---
celery_1       | - ** ---------- [config]
celery_1       | - ** ---------- .> app:         backend:0x7f6af44a8130
celery_1       | - ** ---------- .> transport:   redis://redis:6379/0
celery_1       | - ** ---------- .> results:     redis://redis:6379/0
celery_1       | - *** --- * --- .> concurrency: 4 (prefork)
celery_1       | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery_1       | --- ***** -----
celery_1       |  -------------- [queues]
celery_1       |                 .> celery           exchange=celery(direct) key=celery
celery_1       |
celery_1       |
celery_1       | [tasks]
celery_1       |   . backend.tasks.save_hackathon_to_db
celery_1       |
[2021-09-10 19:59:53,949: INFO/MainProcess] Connected to redis://redis:6379/0
[2021-09-10 19:59:53,960: INFO/MainProcess] mingle: searching for neighbors
[2021-09-10 19:59:54,988: INFO/MainProcess] mingle: all alone
[2021-09-10 19:59:55,019: WARNING/MainProcess] /usr/local/lib/python3.9/site-packages/celery/fixups/django.py:205: UserWarning: Using settings.DEBUG leads to a memory
leak, never use this setting in production environments!
  warnings.warn('''Using settings.DEBUG leads to a memory
[2021-09-10 19:59:55,020: INFO/MainProcess] celery@948692de8c79 ready.
[2021-09-10 19:59:59,464: INFO/MainProcess] Events of group {task} enabled by remote.
```
dashboard logs:

```
dashboard_1    | [I 210910 19:59:54 command:135] Visit me at http://localhost:5555
dashboard_1    | [I 210910 19:59:54 command:142] Broker: redis://redis:6379/0
dashboard_1    | [I 210910 19:59:54 command:143] Registered tasks:
dashboard_1    |     ['backend.tasks.save_hackathon_to_db',
dashboard_1    |      'celery.accumulate',
dashboard_1    |      'celery.backend_cleanup',
dashboard_1    |      'celery.chain',
dashboard_1    |      'celery.chord',
dashboard_1    |      'celery.chord_unlock',
dashboard_1    |      'celery.chunks',
dashboard_1    |      'celery.group',
dashboard_1    |      'celery.map',
dashboard_1    |      'celery.starmap']
dashboard_1    | [I 210910 19:59:54 mixins:229] Connected to redis://redis:6379/0
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method reserved failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method scheduled failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method conf failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method active_queues failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method stats failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method registered failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method active failed
dashboard_1    | [W 210910 19:59:55 inspector:42] Inspect method revoked failed
```