
I am trying to check Celery results from the command line, but I get a "No result backend is configured" error. I have set up Redis as my result backend and am now at a loss.

I have the celery app setup like so:

qflow/celery.py:

import os

from celery import Celery

os.environ.setdefault('CELERY_CONFIG_MODULE', 'qflow.celeryconfig')

app = Celery(
    'qflow',
    include=['qflow.tasks']
)
app.config_from_envvar('CELERY_CONFIG_MODULE')

The config module (qflow/celeryconfig.py) looks like so:

broker_url = 'redis://localhost:6379/0'
result_backend = 'redis://localhost:6379/0'
result_persistent = True
task_result_expires = None
send_events = True

The celery worker starts fine:

$ celery -A qflow worker -E -l info

 -------------- celery@james-laptop v4.0.2 (latentcall)
---- **** ----- 
--- * ***  * -- Linux-4.8.0-52-generic-x86_64-with-debian-stretch-sid 2017-07-21 14:22:34
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         qflow:0x7fcbde317f28
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/0
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: ON
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

It seems to indicate that results are configured.

I import the tasks and start them from my webapp (based on Falcon, running under Gunicorn), but when I try to query a task on the command line with celery result <task_id>, I get:

Traceback (most recent call last):
  File "/home/james/miniconda3/envs/qflow/bin/celery", line 11, in <module>
    load_entry_point('celery', 'console_scripts', 'celery')()
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/__main__.py", line 14, in main
    _main()
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/celery.py", line 326, in main
    cmd.execute_from_commandline(argv)
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/celery.py", line 488, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/base.py", line 281, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/celery.py", line 480, in handle_argv
    return self.execute(command, argv)
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/celery.py", line 412, in execute
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/base.py", line 285, in run_from_argv
    sys.argv if argv is None else argv, command)
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/base.py", line 368, in handle_argv
    return self(*args, **options)
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/base.py", line 244, in __call__
    ret = self.run(*args, **kwargs)
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/bin/result.py", line 40, in run
    value = task_result.get()
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/result.py", line 189, in get
    on_message=on_message,
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/backends/base.py", line 466, in wait_for_pending
    no_ack=no_ack,
  File "/home/james/miniconda3/envs/qflow/lib/python3.6/site-packages/celery/backends/base.py", line 772, in _is_disabled
    raise NotImplementedError(E_NO_BACKEND.strip())
NotImplementedError: No result backend is configured.
Please see the documentation for more information.

I'm running on Linux (4.8.0-52-generic).

jramm

1 Answer


Since no app is named in the command-line arguments, Celery never loads your configuration; it falls back to the defaults, in which the result backend is disabled. If anyone is still facing this issue, try invoking the command with the app specified:

celery -A qflow result <task_id>

Lemon Reddy