I have a setup where airflow runs in a docker container and has its logs directory mapped to a directory on the host. While running, it creates a symlink /usr/local/airflow/logs/scheduler/latest whose target is valid inside the container, but not on the host. The next time I run docker-compose build, it trips on that link and the build fails.
I've tried ignoring the logs directory in .dockerignore, but that did not change anything.
The relevant parts of my setup are as follows:
Working directory

```
***/docker/
    airflow/
        logs/
            scheduler/
                ...
                2018-01-30/
                latest -> /usr/local/airflow/logs/scheduler/2018-01-30
        .dockerignore
        Dockerfile
        ...
    docker-compose.yml
    ...
```
docker-compose.yml

```yaml
version: '2'
services:
  ...
  airflow:
    ...
    build:
      dockerfile: airflow/Dockerfile
      context: .
    environment:
      - LOAD_EX=n
      - EXECUTOR=Local
    volumes:
      - ./airflow/logs:/usr/local/airflow/logs/
  ...
```
Dockerfile

```dockerfile
FROM puckel/docker-airflow:1.8.1
USER root
...
RUN chown -R airflow: /usr/local/airflow
USER airflow
...
```

.dockerignore

```
./logs
...
```
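From what I've read since, .dockerignore patterns are matched relative to the root of the build context, not to the Dockerfile's location. Since my context is `.`, I suspect the entry would need to be something like this (untested):

```
airflow/logs
```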
Error I get in the terminal:

```
$ pwd
/***/docker
$ docker-compose build
Building airflow
Traceback (most recent call last):
  File "/usr/local/bin/docker-compose", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/compose/cli/main.py", line 71, in main
    command()
  File "/usr/local/lib/python2.7/dist-packages/compose/cli/main.py", line 124, in perform_command
    handler(command, command_options)
  File "/usr/local/lib/python2.7/dist-packages/compose/cli/main.py", line 254, in build
    build_args=build_args)
  File "/usr/local/lib/python2.7/dist-packages/compose/project.py", line 364, in build
    service.build(no_cache, pull, force_rm, memory, build_args)
  File "/usr/local/lib/python2.7/dist-packages/compose/service.py", line 967, in build
    'memory': parse_bytes(memory) if memory else None
  File "/usr/local/lib/python2.7/dist-packages/docker/api/build.py", line 150, in build
    path, exclude=exclude, dockerfile=dockerfile, gzip=gzip
  File "/usr/local/lib/python2.7/dist-packages/docker/utils/build.py", line 14, in tar
    root=root, fileobj=fileobj, gzip=gzip
  File "/usr/local/lib/python2.7/dist-packages/docker/utils/utils.py", line 103, in create_archive
    'Can not access file in context: {}'.format(full_path)
IOError: Can not access file in context: /***/docker/airflow/logs/scheduler/latest
```
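As far as I can tell, docker-py stat()s every file it packs into the context tarball, and stat() follows symlinks, so a link whose target only exists inside the container fails on the host. A minimal reproduction of that failure mode (the target path below is illustrative):

```python
import os
import tempfile

# A symlink whose target only exists inside the container is "dangling"
# on the host: the link itself exists, but following it fails.
workdir = tempfile.mkdtemp()
link = os.path.join(workdir, "latest")
os.symlink("/usr/local/airflow/logs/scheduler/2018-01-30", link)

print(os.path.islink(link))   # True  -- the link itself exists
print(os.path.exists(link))   # False -- but its target does not
os.lstat(link)                # fine: inspects the link, not the target
try:
    os.stat(link)             # follows the link, like create_archive() does
except OSError as exc:
    print("stat failed with errno", exc.errno)
```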
If I delete the contents of airflow/logs/ before running docker-compose build, everything works, but airflow creates the same link again and I have to keep deleting it.
I'd like docker-compose to completely ignore the airflow/logs directory. And ideally, the link airflow creates from inside the container would not appear on the host at all.
EDIT in response to my question getting flagged as a potential duplicate of Mount host directory with a symbolic link inside in docker container:

- I am not trying to make a container work with a link on the host. The container creates the link (although I agree the root cause of the link being broken on the host is the same for both questions);
- I don't need the link to work on both the container and the host. I don't really care that the host has a broken symlink. I want that link, or ideally the entire airflow/logs directory, to be ignored by docker;
- I can't just create a dummy directory on the host to cheat my way out of this, because airflow will change the target of that link every day;
- The accepted answer in the linked question does not give a solution; all it says is "don't use symlinks with targets that have different paths on the host and in the container". I'm not in control of that symlink. Airflow is.
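One workaround I'm considering, sketched below and untested (the volume name `airflow-logs` is my own), is swapping the bind mount for a named volume. The logs, and the symlink airflow creates, would then live in Docker-managed storage and never appear in the host directory or the build context:

```yaml
version: '2'
services:
  airflow:
    # ... build/environment as above ...
    volumes:
      - airflow-logs:/usr/local/airflow/logs/

volumes:
  airflow-logs:
```

The trade-off is that the logs would no longer be directly browsable on the host; I'd have to reach them via `docker exec` or `docker volume inspect`.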