
I have a BashOperator in Airflow that executes an aws s3 ls s3:/path command. It works on the console, but it doesn't work in Airflow; the error message is: command not found. The AWS CLI is correctly installed; the following URL explains how the PATH was set: How do I resolve the "-bash: aws: command not found" awscli error? and this one how it was installed: Is it possible to install aws-cli package without root permission?

aws --version
aws-cli/1.18.209 Python/3.8.5 Linux/5.4.0-58-generic botocore/1.19.49

I don't know what I am doing wrong. Please help. How can I make this command (and any other AWS command) work in Airflow?

Thanks in advance.
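A likely cause: Airflow's BashOperator runs bash non-interactively, so ~/.bashrc (where a pip --user install typically adds ~/.local/bin to PATH) is never sourced, and the aws binary is simply not on the PATH the task's shell sees. A minimal, Airflow-free sketch of that lookup failure, using shutil.which and hypothetical PATH values:

```python
import shutil

# bash resolves commands by walking PATH. `sh` lives in /bin or /usr/bin,
# so a PATH that includes those directories finds it...
found = shutil.which("sh", path="/usr/bin:/bin")
# ...while a PATH restricted to an unrelated directory does not -- just as
# aws installed under ~/.local/bin vanishes when Airflow's shell starts
# from a bare system PATH that ~/.bashrc never got a chance to extend.
missing = shutil.which("sh", path="/nonexistent")

print(found)    # an absolute path such as /usr/bin/sh
print(missing)  # None -> "command not found" in a shell
```

One workaround, then, is to call the CLI by its absolute path in bash_command (run `which aws` in the console where it works to find it); the exact install location varies, so any path you hard-code is an assumption about your own setup.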

Jarvis
Israel Rodriguez
  • Airflow might be creating a non-interactive shell so you don't have direct access to the AWS binaries in the shell. Try running source ~/.bashrc or source .profile before executing the aws commands. – Nadim Younes Jan 06 '21 at 23:45
  • are you running airflow in docker perhaps? – dstandish Jan 07 '21 at 06:49
  • @chorbs It is not running in Docker, it is on Ubuntu – Israel Rodriguez Jan 07 '21 at 10:34
  • 1
    add your operator instantiation code; show output of `which path` in the terminal immediately before running `airflow test ...` for the task in the same terminal. the env should propagate, unless you provide env explicitly in bash op. – dstandish Jan 07 '21 at 16:32
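The comments above can be sketched without Airflow itself: a child shell only finds a command if the command's directory is on the PATH the shell inherits (or is handed explicitly, as with BashOperator's env argument). A subprocess analogue, with hypothetical PATH values:

```python
import subprocess

# Run the same command under two environments, mimicking a BashOperator
# task with and without a PATH that covers the CLI's install directory.
full_env = {"PATH": "/usr/bin:/bin"}   # standard tools are findable
bare_env = {"PATH": "/nonexistent"}    # PATH missing the CLI's directory

ok = subprocess.run(["/bin/bash", "-c", "command -v sh"],
                    env=full_env, capture_output=True, text=True)
bad = subprocess.run(["/bin/bash", "-c", "command -v sh"],
                     env=bare_env, capture_output=True, text=True)
# `ok` exits 0 (sh resolved); `bad` exits non-zero, bash's equivalent
# of the "command not found" the question reports for aws.
```

By the same token, passing env={...} to BashOperator replaces the task's environment entirely, so an explicit env must include a PATH that covers the aws install directory; alternatively, prepending `source ~/.profile && ` to bash_command, as Nadim Younes suggests above, restores the interactive shell's PATH.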

0 Answers