2

I'm using Django 2.x and Celery 4.3.x.

In my Django application, I'm using dotenv to serve environment variables from a .env file. To load them, I have the following script in the manage.py and wsgi.py files:

import os
import dotenv

env_file = os.path.join(os.path.dirname(os.path.realpath(__file__)), '.env')
dotenv.read_dotenv(env_file)

The environment variables include the AWS credentials used by the Anymail plugin to send mail via SES.
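Roughly, the relevant wiring looks like this (a sketch using the standard Anymail backend path and boto3 environment variable names, not my exact settings):

# settings.py (sketch)
EMAIL_BACKEND = 'anymail.backends.amazon_ses.EmailBackend'

# Anymail's SES backend hands off to boto3, which reads the credentials
# from the process environment (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY,
# AWS_DEFAULT_REGION), so those keys live in the .env file and have to be
# loaded into the environment before sending.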

Now I'm using a Celery task to send the email, and I run the Celery worker from the command line with:

celery -A myapp worker -l debug

The worker runs, but when it tries to send an email the Celery task fails with:

ERROR/ForkPoolWorker-2] An error occurred (AccessDenied) when calling 
the SendRawEmail operation: User `arn:aws:iam::user_id:user/my-laptop` is not 
authorized to perform this action...

It seems to be connecting with my laptop's IAM user instead of using the credentials defined in the .env file.

How can I use the .env file to serve the environment variables to the Celery worker?

Anuj TBE
  • Take a look at [this similar question](https://stackoverflow.com/q/45894658/647002), particularly the comments from Daniel Roseman. It sounds like the way you're starting Celery on the command line doesn't run any of your manage.py or wsgi.py code that would read dotenv. – medmunds Aug 16 '19 at 17:41

2 Answers

1

Solved by loading the environment variables in the Celery config file

celery.py

env_file = os.path.join(os.path.dirname(os.path.dirname(os.path.realpath(__file__))), '.env')
dotenv.read_dotenv(env_file)
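
For context, the read_dotenv call has to run before the Celery app is created and Django settings are loaded. A minimal sketch of the whole celery.py, assuming the standard Django/Celery layout where celery.py lives inside the myapp project package (hence the extra dirname to reach the project root):

import os
import dotenv
from celery import Celery

# load the .env from the project root before Django settings are imported,
# so the AWS credentials end up in the worker's environment
env_file = os.path.join(os.path.dirname(os.path.dirname(os.path.realpath(__file__))), '.env')
dotenv.read_dotenv(env_file)

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings')

app = Celery('myapp')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()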
Anuj TBE
0

Not sure how to do it with dotenv, but I use python-decouple to pull parameters from a .env file in a few Celery tasks.

from decouple import config

AUTH_USER = config('AUTH_USER')
AUTH_PASS = config('AUTH_PASS')
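
A rough sketch of how that might look inside a task (the task name and what the credentials are used for are illustrative, not from my project):

from celery import shared_task
from decouple import config

@shared_task
def send_notification_email(recipient, subject, body):
    # python-decouple reads the values from the .env file directly,
    # so the worker doesn't depend on manage.py/wsgi.py having run
    auth_user = config('AUTH_USER')
    auth_pass = config('AUTH_PASS')
    # ... authenticate with the mail provider using auth_user / auth_pass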
bdoubleu