
I am trying to run a simple task in Celery. The requirement is to add tasks to a queue, get their task_id, and then later retrieve each task's result using that task_id.

I tried the code below but get the following error.

Error on the terminal:

Traceback (most recent call last):
  File "c:\users\admin\appdata\local\programs\python\python38\lib\site-packages\billiard\pool.py", line 361, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\admin\appdata\local\programs\python\python38\lib\site-packages\celery\app\trace.py", line 664, in fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
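The unpacking failure itself is easy to reproduce outside Celery: the traceback points at `tasks, accept, hostname = _loc` in celery/app/trace.py, which fails exactly this way when `_loc` is empty (a plain-Python illustration; `_loc` here is just a stand-in for Celery's per-worker module state, not Celery itself):

```python
# fast_trace_task unpacks three values from a module-level tuple;
# if that tuple is empty (worker child process never initialized),
# the unpacking raises the same ValueError as in the traceback.
_loc = ()  # stand-in for Celery's uninitialized per-worker state

try:
    tasks, accept, hostname = _loc
except ValueError as e:
    print(e)  # -> not enough values to unpack (expected 3, got 0)
```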

Error stored in the MongoDB celery_taskmeta collection:

{
  "_id": "537073b1-e758-4315-b386-31a2c62b3b45",
  "status": "FAILURE",
  "result": "{\"exc_type\": \"ValueError\", \"exc_message\": [\"not enough values to unpack (expected 3, got 0)\"], \"exc_module\": \"builtins\"}",
  "traceback": null,
  "children": [],
  "date_done": "2023-08-31T06:13:37.776662"
}

main.py

from celery import Celery

app = Celery("main", broker="amqp://guest:guest@localhost:5672", backend="mongodb://localhost:27017/celery")

@app.task
def add(x=3, y=4):
    # Defaults let the task run both with apply_async() (no args)
    # and with delay(3, 4) further below
    return x + y

I am starting the worker for main.py with the following command:

celery -A main worker --loglevel=info
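Since the traceback comes from billiard's pool, and my understanding is that the default prefork pool is problematic on Windows, the worker can also be started with the solo pool as a point of comparison:

```shell
# Run tasks in the worker's main process instead of billiard's
# prefork child processes (which misbehave on Windows)
celery -A main worker --loglevel=info --pool=solo
```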

executor.py

from main import add
from celery.result import AsyncResult

result = add.apply_async()

task_id = result.id
print("Task ID:", task_id)

# Pass the app so the AsyncResult is bound to the configured result backend
task_result = AsyncResult(task_id, app=add.app)

if task_result.ready():
    result_value = task_result.get()
    print("Task Result:", result_value)
else:
    print("Task is still processing")

I get the task_id, and "Task is still processing" is printed, but the status of the task is FAILURE because of the error above.

I get the same error when using delay() and get():

from main import add

result = add.delay(3, 4)

task_result = result.get()

print("Task Result:", task_result)
Veera Silamban