14

We're running into a permission error when using Airflow, receiving the following error:

PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler/2019-12-18/../../../../home

We've tried running chmod -R 777 on the /usr/local/airflow/logs/scheduler directory inside the container, but this doesn't seem to have done the trick.

We have this piece in our entrypoint.sh script:

export AIRFLOW__CORE__BASE_LOGS_FOLDER="/usr/local/airflow/logs"

Has anyone else run into this airflow log permission issue? Can't seem to find much about this one in particular online.

phenderbender

8 Answers

11

Permissions on a bind-mounted folder can also cause this error.

For example:

docker-compose.yml (pseudo code)

    service_name:
      ...
      volumes:
        - /home/user/airflow_logs:/opt/airflow/logs

Grant permission on the local folder so that the Airflow container can write logs, create directories if needed, etc.:

 sudo chmod u=rwx,g=rwx,o=rwx /home/user/airflow_logs
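
If you prefer something narrower than world-writable, handing the folder to the UID the container runs as also works. A sketch, assuming the official apache/airflow image, whose default user is UID 50000 with GID 0 (adjust to whatever your image actually uses):

    # Create the host folder up front and give it to the container's user
    sudo mkdir -p /home/user/airflow_logs
    sudo chown -R 50000:0 /home/user/airflow_logs
    sudo chmod -R u+rwX,g+rwX /home/user/airflow_logs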
Sairam Krish
I had the same issue and I solved it with the command: sudo chmod -R 777 /home/user/airflow_logs. Also, it is important to mention that this can be applied to any folder one is trying to export from the container. – Lucas Thimoteo Jan 14 '21 at 21:31
8

Just for anyone with the same issue...

Surprisingly, I had to take a look at the Airflow documentation... and according to it:

On Linux, the mounted volumes in container use the native Linux filesystem user/group permissions, so you have to make sure the container and host computer have matching file permissions.

mkdir ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env

Once you have matched file permissions:

docker-compose up airflow-init
docker-compose up
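
For context, the .env value works because the reference docker-compose.yaml runs the Airflow services as that UID; the relevant fragment looks roughly like this (paraphrased, so treat it as a sketch rather than the exact file):

    # shared x-airflow-common section of the reference docker-compose.yaml
    user: "${AIRFLOW_UID:-50000}:0"
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
      - ./plugins:/opt/airflow/plugins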
rubbengimenez
Thank you, this worked for me. If you don't set AIRFLOW_UID, all files will be created as the root user, which causes permission issues. – Adisesha Jan 27 '22 at 10:25
5

I solved the issue: in my case the problem was that the volume-mounted folders, logs and dags, didn't have write permission. I added it with:

chmod -R 777 dags/
chmod -R 777 logs/

and in the docker-compose file they are mounted as:

    volumes:
      - ./dags:/opt/bitnami/airflow/dags
      - ./logs:/opt/bitnami/airflow/logs
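
If you want to avoid 777, giving the folders to the UID the container runs as is a narrower option. A sketch, assuming the Bitnami images' usual non-root UID of 1001 (check with docker-compose exec <service> id if unsure):

    sudo chown -R 1001 dags/ logs/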
Galuoises
2

I also had the same problem using Apache Airflow 1.10.7.

Traceback (most recent call last):
  File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 135, in _run_file_processor
    set_context(log, file_path)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 198, in set_context
    handler.set_context(value)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
    local_loc = self._init_file(filename)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 148, in _init_file
    os.makedirs(directory)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  [Previous line repeated 5 more times]
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 221, in makedirs
    mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: '/media/radifar/radifar-dsl/Workflow/Airflow/airflow-home/logs/scheduler/2020-01-04/../../../../../../../home'

After checking how file_processor_handler.py works, I found that the error was caused by the example DAGs living in a different location from our DAG folder setting. In my case, 7 folders above the folder 2020-01-04 is /media/radifar; in your case, 4 folders above the folder 2019-12-18 is /usr/local. That's why the PermissionError was raised.
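
To see where the ../.. segments come from: the scheduler's log handler names each per-file log directory using the DAG file's path relative to your DAG folder, so an example DAG that lives outside that folder walks back up the tree. A small illustration with made-up paths (not your exact layout):

    import os

    dag_folder = "/usr/local/airflow/dags"  # hypothetical dags folder
    example_dag = ("/usr/local/lib/python3.7/site-packages/"
                   "airflow/example_dags/example_bash_operator.py")

    # roughly what the handler computes before appending it to logs/scheduler/<date>/
    print(os.path.relpath(example_dag, start=dag_folder))
    # ../../lib/python3.7/site-packages/airflow/example_dags/example_bash_operator.py

os.makedirs() then tries to create those parent directories above the logs folder, which is where the PermissionError comes from.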

I was able to solve this problem by cleaning the AIRFLOW_HOME folder, running airflow version, setting load_examples to False in airflow.cfg, and then running airflow initdb. After that I could use Airflow without error.

Muhammad Radifar
    For me load_examples = False was enough to fix the problem. – rwitzel Mar 18 '20 at 09:47
    Turning off examples also resolved the error for me. – cdabel Mar 20 '20 at 16:12
I have set `load_examples = False` but I am still getting the Permission denied error. `airflow initdb` and the Airflow UI worked perfectly until yesterday, but today I am getting this error. Can someone please help. – mockash Jun 16 '20 at 04:51
  • @alex yes solved it. For me it was related to permission issues. I changed the permissions to 777 and that helped me. What is the error you are getting now? – mockash Aug 25 '20 at 11:50
  • ok interesting - so you changed permissions for /scheduler logs folder? I posted here my issue https://stackoverflow.com/questions/63510335/airflow-on-kubernetes-errno-13-permission-denied-opt-airflow-logs-schedule this is airflow running on kubernetes – alex Aug 25 '20 at 12:29
1

I had the same error.

PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler'

The reason I got that error is that I didn't create the initial 3 folders (dags, logs, plugins) before running the Airflow Docker container. Docker seems to have created them automatically, but the permissions were wrong.

Steps to fix (a consolidated sketch follows the list):

  1. Stop the current containers:

     docker-compose down --volumes --remove-orphans

  2. Delete the folders dags, logs, plugins.
  3. Just in case, destroy the images and volumes already created (in Docker Desktop).
  4. Create the folders again from the command line:

     mkdir logs dags plugins

  5. Run Airflow in Docker again:

     docker-compose up airflow-init
     docker-compose up
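
As a single sequence (a sketch; the AIRFLOW_UID line is only relevant on Linux and mirrors the documentation excerpt quoted in another answer here):

    docker-compose down --volumes --remove-orphans
    rm -rf dags logs plugins            # drop the folders Docker created as root
    mkdir -p dags logs plugins          # recreate them owned by your host user
    echo -e "AIRFLOW_UID=$(id -u)" > .env
    docker-compose up airflow-init
    docker-compose up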
Dominik Sajovic
0

I was having the same problem running an Airflow image on Docker hosted by Windows.

My solution was to override the CMD in the scheduler's Dockerfile with a CMD that sets the file permissions before launching the default CMD.

The default CMD can be obtained with docker inspect -f '{{.Config.Cmd}}' <schedulerImageId>.

Example: I used the Bitnami image (docker.io/bitnami/airflow-scheduler:2.1.4-debian-10-r16). Inspecting the image, I saw that the default CMD was:

/opt/bitnami/scripts/airflow-scheduler/run.sh

So I created a run.sh script with the following content:

#!/bin/bash
# Open up the logs directory, then hand off to the image's original entrypoint
chmod -R 777 /opt/bitnami/airflow/logs/
. /opt/bitnami/scripts/airflow-scheduler/run.sh

Then I added the following lines at the end of my dockerfile:

COPY run.sh /
RUN  chmod +x /run.sh

CMD /run.sh
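
If the scheduler image is built through docker-compose, pointing the service at the custom Dockerfile looks roughly like this (the service and file names here are placeholders):

    airflow-scheduler:
      build:
        context: .
        dockerfile: Dockerfile.scheduler   # the Dockerfile containing the lines above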
Starnuto di topo
0

A little late to the party, but you could add your user to the group that creates the directory.

When your docker-compose stack is up, you can run docker-compose exec SERVICE_NAME bash, check which group the directory in question belongs to, and then add that group to your user in docker-compose.yml:

    service_name:
      ...
      user: USER_NAME:USER_GROUP
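
For instance, a quick way to see the numeric owner and group of the logs directory, followed by a concrete (made-up) value for the user: key:

    # inside the running container: numeric UID/GID that own the logs directory
    docker-compose exec service_name bash -c "ls -ln /opt/airflow/logs"

    # docker-compose.yml -- run the service as a matching user/group (placeholder values)
    service_name:
      ...
      user: "50000:0"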
0

Another approach would be to copy the files into the image whilst also changing the ownership.

COPY --chown=airflow . .
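
In context, that line would sit in a Dockerfile based on an image that already defines the airflow user; a minimal sketch (the tag and working directory are placeholders):

    FROM apache/airflow:2.3.3
    WORKDIR /opt/airflow/dags
    # files land owned by the airflow user rather than root
    COPY --chown=airflow . .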
Giorgos Myrianthous