Brief: I can run a DAG with `$ python my_dag.py`, but the Airflow UI reports the error `No module named 'my_file_to_be_imported'`.
I have a container with a `dags` folder and a `lutils` folder (a custom folder of mine), both git folders mapped as volumes inside the Airflow home, like this:

```
airflow_home
├── dags
│   ├── __init__.py
│   └── my_dag.py
└── lutils
    ├── __init__.py
    └── my_file_to_be_imported.py
```
The `my_dag.py` file inside the `dags` folder needs to read content from the `lutils` folder. The (simplified) `my_dag.py` is defined as below:
```python
import sys
sys.path.append('../')

from airflow.operators.python_operator import PythonOperator

from lutils import my_file_to_be_imported


def do_something():
    my_file_to_be_imported.beauty_imported_method()


t1 = PythonOperator(
    task_id='test_generate',
    python_callable=do_something,
    dag=dag)  # dag is defined elsewhere in the real file

my_file_to_be_imported.beauty_imported_method()  # to check if python runs
print(my_file_to_be_imported.var)  # to check if python runs
```
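Note that a relative entry like `'../'` in `sys.path` is resolved against the interpreter's current working directory at import time, not against the location of `my_dag.py`, which may explain why the import works from the shell but not under the scheduler. A minimal standalone sketch of that behavior (not Airflow-specific):

```python
import os
import sys

# A relative sys.path entry is re-resolved against os.getcwd() whenever
# an import is attempted, so its meaning changes with the working directory.
sys.path.append('../')

resolved = os.path.abspath('../')
print(resolved)  # the directory '../' actually points to right now

# If the interpreter had been started elsewhere (as the Airflow scheduler
# may be), 'resolved' would be a different directory entirely.
```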
and the `my_file_to_be_imported.py` file inside the `lutils` folder as:
```python
def beauty_imported_method():
    with open('text.txt', 'a') as f:
        f.write("test")

var = "my test var"
```
If I run it from bash with `$ python my_dag.py` (as a plain Python script), it executes `beauty_imported_method` fine and prints the `var` variable.
But inside Airflow there is a red warning saying: `Broken DAG: [path_to_airflow_home/dags/my_dag.py] No module named 'my_file_to_be_imported'`
How can I fix this so that Airflow resolves my import the same way a plain Python run does?
I have read this very closely related question on Stack Overflow, but it did not work.
P.S.: this Docker setup runs other DAGs fine, as long as they do not rely on relative imports.
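For reference, one workaround I am considering is a sketch that builds an absolute path from `__file__` instead of relying on `'../'`, assuming the `airflow_home` layout above; since this no longer depends on the working directory, it should behave the same under `python my_dag.py` and the scheduler:

```python
import os
import sys

# Build an absolute path to airflow_home from this file's own location:
# one dirname gives the dags folder, a second gives airflow_home.
AIRFLOW_HOME = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(AIRFLOW_HOME)

# from lutils import my_file_to_be_imported  # now resolvable from anywhere
```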