I am setting up an AWS MWAA instance.
I have custom operators that themselves reference other Python files. I followed the directory structure suggested here (by astronomer.io) and can run my Airflow environment locally without issue.
However, when moving my code base to the S3 bucket, the AWS service isn't able to find my custom operators.
My file structure on the S3 bucket looks like this:
- s3://{my-bucket-name}
  - dags/
    - <dag 1>.py
    - etc...
  - requirements.txt
  - plugins.zip
And my plugins.zip file structure looks like this:
- plugins.zip
  - libs/
    - <my custom lib 1>.py
    - etc...
  - operators/
    - <operator that imports custom lib 1>.py
    - etc...
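For reference, here is a sketch (with hypothetical file names) of how an archive with that layout can be built, zipping the directory *contents* so `operators/` and `libs/` sit at the zip root rather than nested under a parent folder:

```shell
# Create a stand-in for the local plugins source tree (hypothetical names).
mkdir -p plugins/operators plugins/libs
touch plugins/libs/my_lib.py plugins/operators/my_operator.py

# Zip the *contents* of plugins/ so operators/ and libs/ sit at the
# archive root; zipping the plugins/ folder itself would nest them
# one level deeper and change the import path.
(cd plugins && zip -r ../plugins.zip operators libs)

# Entries should be listed as operators/... and libs/..., not plugins/...
unzip -l plugins.zip
```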
However, once the service starts, Python raises the error:

    ModuleNotFoundError: No module named 'operators'
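For context, this import pattern works locally whenever the directory containing `operators/` and `libs/` is on `sys.path`. Here is a minimal, self-contained reproduction (hypothetical file names) of what I expect to happen with the extracted plugins.zip:

```python
# Simulate an extracted plugins.zip whose root holds operators/ and libs/,
# then import an operator that itself imports from libs.
import pathlib
import sys
import tempfile

root = pathlib.Path(tempfile.mkdtemp())  # stands in for the extracted plugins.zip
(root / "operators").mkdir()
(root / "libs").mkdir()
(root / "libs" / "my_lib.py").write_text("VALUE = 42\n")
(root / "operators" / "my_operator.py").write_text(
    "from libs.my_lib import VALUE\n"
)

# With the archive root on sys.path, both top-level imports resolve.
sys.path.insert(0, str(root))
from operators.my_operator import VALUE

print(VALUE)  # 42
```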
I know some docs use the Airflow "Plugins" structure for these additional modules, but that seems unnecessary and is even recommended against in the first link I shared here:
> ...According to the Airflow documentation, [plugins] can be added using Airflow's Plugins mechanism. This however, overcomplicates the issue and leads to confusion for many people. Airflow is even considering deprecating using the Plugins mechanism for hooks and operators going forward.
Does anyone know what file structure to use for importing simple custom Python modules and operators in MWAA?