
I am trying to set up Airflow in WSL and running into problems. I have gone through articles and SO posts but have not been able to resolve this.

I am following the Airflow tutorial https://airflow.apache.org/docs/stable/tutorial.html to get set up. I am able to run the final script in the tutorial above without errors. Basically, this piece of code:

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta


default_args = {
    'owner': 'Airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

dag = DAG(
    'tutorial', default_args=default_args, schedule_interval=timedelta(days=1))

# t1, t2 and t3 are examples of tasks created by instantiating operators
t1 = BashOperator(
    task_id='print_date',
    bash_command='date',
    dag=dag)

t2 = BashOperator(
    task_id='sleep',
    bash_command='sleep 5',
    retries=3,
    dag=dag)

templated_command = """
    {% for i in range(5) %}
        echo "{{ ds }}"
        echo "{{ macros.ds_add(ds, 7)}}"
        echo "{{ params.my_param }}"
    {% endfor %}
"""

t3 = BashOperator(
    task_id='templated',
    bash_command=templated_command,
    params={'my_param': 'Parameter I passed in'},
    dag=dag)

t2.set_upstream(t1)
t3.set_upstream(t1)

I am then able to execute it without errors using

python ~/airflow/dags/FirstTest.py

However, when I try to run the command

airflow list_dags

I get an error saying

airflow: command not found

I was able to do a pip install apache-airflow and therefore the original script could run.
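(One thing I suspect, though I have not confirmed it: when pip installs for a non-root user, the airflow entry point typically ends up in ~/.local/bin, which may not be on the PATH in a fresh WSL shell. Something like this should make it visible:)

# sketch: put pip's user-level scripts directory on the PATH (the airflow entry point probably lives there)
export PATH="$HOME/.local/bin:$PATH"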

However, according to this SO question (though for a different platform), Getting bash: airflow: command not found,

running the pip install with sudo fixed it. In my case, though, it gives me an error saying

'/home/sum/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled

I then used the -H flag, i.e.

sudo -H pip3 install apache-airflow

But it seemed it was not hitting Fiddler (127.0.0.1:8888), which I have as a proxy, and I thus got a connection refused error.

Therefore I modified it to include the proxy explicitly:

sudo pip3 --proxy=http://127.0.0.1:8888 install apache-airflow

But then I ran into the issue that it went to PyPI and not npm.sys.dom, which is what I wanted.

In my /etc/profile file I have

export http_proxy=http://127.0.0.1:8888
export https_proxy=http://127.0.0.1:8888
export pip_index_url=http://npm.sys.dom:81/pypi/Python/simple
export pip_trusted_host=npm.sys.dom

But somehow it seems VS Code does not pick them up.

I also tried editing the ".profile" file within VS Code (it showed up after I chose to open the folder corresponding to my home directory), adding the same four lines as above.

But the sudo pip install still would not pick them up.
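(If I understand correctly, there are two likely reasons: sudo resets the environment by default, so exports from /etc/profile or .profile never reach it, and pip only reads these settings from the upper-case variables PIP_INDEX_URL and PIP_TRUSTED_HOST. Presumably something like this would have worked:)

# pip honours the upper-case forms of these variables
export PIP_INDEX_URL=http://npm.sys.dom:81/pypi/Python/simple
export PIP_TRUSTED_HOST=npm.sys.dom
# -E asks sudo to keep the caller's environment (if sudoers permits it)
sudo -E pip3 --proxy=http://127.0.0.1:8888 install apache-airflow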

Finally I got around it by running

sudo pip3 --proxy=http://127.0.0.1:8888 install --trusted-host npm.sys.dom --index-url http://npm.sys.dom:81/pypi/Python/simple apache-airflow
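(A pip configuration file would presumably avoid having to repeat these flags every time; a sketch, assuming the same proxy and index:)

# write a system-wide pip config so that both plain and sudo pip pick it up
sudo tee /etc/pip.conf > /dev/null <<'EOF'
[global]
proxy = http://127.0.0.1:8888
index-url = http://npm.sys.dom:81/pypi/Python/simple
trusted-host = npm.sys.dom
EOF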

But now when I run

airflow list_dags

I get the DAGs that I have, but I also get a bunch of errors before the output:

ERROR - Failed on pre-execution callback using <function default_action_log at 0x7fa84c2db510>
Traceback (most recent call last):
  File "/home/sum/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1246, in _execute_context
    cursor, statement, parameters, context
  File "/home/sum/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 581, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: no such table: log

......
(Background on this error at: http://sqlalche.me/e/e3q8)

When I look at http://sqlalche.me/e/e3q8, it shows something related to a connection not being released, which I will investigate further.
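(From what I have read, this usually means the Airflow metadata database has not been initialised yet; with the Airflow 1.x CLI the tables, including the missing log table, should be created by:)

airflow initdb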

My question is: given all the workarounds I had to do to install Airflow, is there a cleaner way, and what did I miss? Just so that I do not have to keep doing the same stuff again and again.

P.S.: I am not super well versed in Linux. I am using Ubuntu 18.04 for Windows (WSL).

  • FYI you should never execute pip using `sudo`. It is much better and safer to use a virtual environment, else you run the risk of breaking your Python installation (and your OS if it relies on Python in any way). – Brett Cannon Dec 11 '19 at 20:06

1 Answer


I finally got around this one by:

  1. Adding the http and https proxies to the .bashrc file so that I do not have to specify them every time I use pip3 within VS Code. Fiddler is the proxy here.
echo 'export HTTP_PROXY="127.0.0.1:8888"' >> ~/.bashrc
echo 'export HTTPS_PROXY="127.0.0.1:8888"' >> ~/.bashrc
  2. Creating a virtual environment in Python and installing apache-airflow within it (see the verification note after the commands).
sudo apt-get install python3-venv


# Please replace with a path of your own choosing; the one below is a sample
python3 -m venv /home/usrname/airflow
source /home/usrname/airflow/bin/activate

# Finally, install Airflow. Remember to install setuptools first. I also had other packages to install.

pip install setuptools_git "pymssql<3.0" pyodbc
pip install apache-airflow
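After activating the virtual environment, initialising the metadata database should also take care of the "no such table: log" errors (assuming the Airflow 1.10.x CLI):

# with the venv still active
airflow initdb        # creates the metadata tables, including the log table
airflow list_dags     # should now list the DAGs without the sqlite errors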