
I'm setting up celeryd and Redis on Elastic Beanstalk as a daemon, per "How do you run a worker with AWS Elastic Beanstalk?".

I think I'm almost there. Redis is running and accessible on ElastiCache.

But the .ebextensions file that sets up celeryd is failing on a specific line. The entire script I'm running in .ebextensions is further below.

The failing line:

command=/opt/python/run/venv/bin/celery worker -A myappname --loglevel=INFO

I've tried a few different variations of the command (a quick import sanity check is sketched after the list).

Tried:

  • command=/opt/python/run/venv/bin/celery worker -A /opt/python/current/app/app.py --loglevel=INFO (directory to my app.py)

  • command=/opt/python/run/venv/bin/celery worker -A app.py --loglevel=INFO (name of my app.py)

  • command=/opt/python/run/venv/bin/celery worker -A app --loglevel=INFO (name without extension)

  • command=/opt/python/run/venv/bin/celery worker -A flask-celery-01 --loglevel=INFO (app name in elastic beanstalk)

  • command=/opt/python/run/venv/bin/celery worker -A myappname --loglevel=INFO (original unedited)
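
As the sanity check mentioned above (just a sketch I put together, assuming the virtualenv's python sits next to the celery binary), importing the module directly from the app directory shows whether the -A value can resolve at all:

cd /opt/python/current/app
/opt/python/run/venv/bin/python -c "import app; print(app.celery)"

If that import fails, none of the -A values above can work either.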

Traceback in celery logs:

Traceback (most recent call last):
  File "/opt/python/run/venv/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/__main__.py", line 30, in main
    main()
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/celery.py", line 769, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 305, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 465, in setup_app_from_commandline
    self.app = self.find_app(app)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 485, in find_app
    return find_app(app, symbol_by_name=self.symbol_by_name)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/app/utils.py", line 229, in find_app
    sym = symbol_by_name(app, imp=imp)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 488, in symbol_by_name
    return symbol_by_name(name, imp=imp)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/kombu/utils/__init__.py", line 96, in symbol_by_name
    module = imp(module_name, package=package, **kwargs)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/utils/imports.py", line 101, in import_from_cwd
    return imp(module, package=package)
  File "/opt/python/run/venv/lib64/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 978, in _gcd_import
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 948, in _find_and_load_unlocked
ModuleNotFoundError: No module named '/opt/python/current/app/app'

.ebextensions script:

files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash

      # Get django environment variables
      celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g'`
      celeryenv=${celeryenv%?}

      # Create celery configuration script
      celeryconf="[program:celeryd]
      ; Set full path to celery program if using virtualenv
      command=/opt/python/run/venv/bin/celery worker -A /opt/python/current/app/app.py --loglevel=INFO

      directory=/opt/python/current/app
      user=nobody
      numprocs=1
      stdout_logfile=/var/log/celery-worker.log
      stderr_logfile=/var/log/celery-worker.log
      autostart=true
      autorestart=true
      startsecs=10

      ; Need to wait for currently executing tasks to finish at shutdown.
      ; Increase this if you have very long running tasks.
      stopwaitsecs = 600

      ; When resorting to send SIGKILL to the program to terminate it
      ; send SIGKILL to its whole process group instead,
      ; taking care of its children as well.
      killasgroup=true

      ; if rabbitmq is supervised, set its priority higher
      ; so it starts first
      priority=998

      environment=$celeryenv"

      # Create the celery supervisord conf script
      echo "$celeryconf" | tee /opt/python/etc/celery.conf

      # Add configuration script to supervisord conf (if not there already)
      if ! grep -Fxq "[include]" /opt/python/etc/supervisord.conf
          then
          echo "[include]" | tee -a /opt/python/etc/supervisord.conf
          echo "files: celery.conf" | tee -a /opt/python/etc/supervisord.conf
      fi

      # Reread the supervisord config
      supervisorctl -c /opt/python/etc/supervisord.conf reread

      # Update supervisord in cache without restarting all services
      supervisorctl -c /opt/python/etc/supervisord.conf update

      # Start/Restart celeryd through supervisord
      supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd
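
After this hook runs, I inspect the worker like so (my own sketch, not from the tutorial; it just queries supervisord with the config path above and tails the log file the config writes to):

supervisorctl -c /opt/python/etc/supervisord.conf status celeryd
tail -n 50 /var/log/celery-worker.log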

It could be because my Flask app is not split into modules (everything is in app.py), and I haven't "installed" the app as a package as I've seen in some tutorials. I just deployed with EB, and the web UI works.

Flask and AWS are both new to me, and I'd be happy to get this into production any way I can.

Update Per Comments:

Tried:

/opt/python/run/venv/bin/celery worker -A app:app --loglevel=INFO

Traceback (most recent call last):
  File "/opt/python/run/venv/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/__main__.py", line 30, in main
    main()
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/celery.py", line 769, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 305, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 473, in setup_app_from_commandline
    user_preload = tuple(self.app.user_options['preload'] or ())
AttributeError: 'Flask' object has no attribute 'user_options'

Tried:

/opt/python/run/venv/bin/celery worker -A app:app.celery --loglevel=INFO

Traceback (most recent call last):
  File "/opt/python/run/venv/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/__main__.py", line 30, in main
    main()
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/celery.py", line 769, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 305, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 465, in setup_app_from_commandline
    self.app = self.find_app(app)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/bin/base.py", line 485, in find_app
    return find_app(app, symbol_by_name=self.symbol_by_name)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/app/utils.py", line 232, in find_app
    sym = imp(app)
  File "/opt/python/run/venv/local/lib/python3.6/site-packages/celery/utils/imports.py", line 101, in import_from_cwd
    return imp(module, package=package)
  File "/opt/python/run/venv/lib64/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 978, in _gcd_import
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 936, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 978, in _gcd_import
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 948, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'app:app'

Locally I run:

celery worker -A app.celery --loglevel=info

And this works.

app.py

import os
import random
import time
from flask import Flask, request, render_template, session, flash, redirect, \
    url_for, jsonify
from flask.ext.mail import Mail, Message
from celery import Celery
import psycopg2 as pg

app = Flask(__name__)

REDIS_CLUSTER = os.environ.get('CLUSTER_URL')


# celery config
app.config['CELERY_BROKER_URL'] = REDIS_CLUSTER
app.config['CELERY_RESULT_BACKEND'] = REDIS_CLUSTER

# initialize extensions
mail = Mail(app)

# initialize celery
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)


DB_NAME = os.environ.get('DB_NAME')
DB_USER = os.environ.get('DB_USER')
DB_PASSWORD = os.environ.get('DB_PASSWORD')
DB_HOST = os.environ.get('DB_HOST')

@celery.task
def do_sth_to_db():

    """insert record into db"""
    with app.app_context():

        # local: fintrosapp_ml
        conn = pg.connect(
            dbname=DB_NAME,
            user=DB_USER,
            password=DB_PASSWORD,
            host=DB_HOST
        )

        cursor = conn.cursor()

        query = """
            INSERT INTO examples(first_name, last_name)
            VALUES ('First', 'Last');
        """
        cursor.execute(query)
        conn.commit()


@app.route('/', methods=['GET', 'POST'])
def index():
    if request.method == 'GET':
        return render_template('index.html', email=session.get('email', ''))

    # backgrounded task
    do_sth_to_db.delay()

    return redirect(url_for('index'))


###############################################################################
if __name__ == '__main__':
    app.run() # debug=True

1 Answer


You need to use it like below:

celery worker -A app:celery --loglevel=info

The A.B.C:D syntax means: import the module A.B.C and use the D object from it.
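
Applied to your app.py (a sketch only, assuming the module stays importable because of directory=/opt/python/current/app in your supervisord config), the command in the .ebextensions hook would become:

command=/opt/python/run/venv/bin/celery worker -A app:celery --loglevel=INFO

Here app is the module (app.py) and celery is the Celery() instance defined inside it.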

  • Hi @TarunLalwani, please can you check my issue? https://stackoverflow.com/questions/50046825/issues-with-celery-configuration-on-aws-elastic-beanstalk-no-config-updates-t – Fabio Apr 30 '18 at 17:42