
I am trying to add a .gitlab-ci.yml file to my gitlab project; the file looks like:

image: continuumio/miniconda3:latest

before_script:
  - conda env create -f environment.yml
  - conda activate py3p10 
  - export MY_PROJECT_ROOT=$PWD
  - export PYTHONPATH+=:$PWD
tests:
  stage: test
  script:
    - pytest tests -W ignore::DeprecationWarning

Now, environment.yml contains about 30 packages, and when I push to a branch the job downloads and installs all of them from scratch. This makes each job take about 10 minutes, which seems pretty wasteful. Is there a way to tell GitLab to cache that conda environment so that it gets reused?

From:

https://docs.gitlab.com/ee/ci/caching/#cache-python-dependencies

it seems that caching is possible, but the example covers virtualenv, not conda. From:

Caching virtual environment for gitlab-ci

the top answer discourages caching with conda.

Cheers.

I am expecting to be able to cache the environment so that the full job takes around 20 seconds.

acampove

1 Answer


Here is how I handle things.

It is important to note that the environment needs to live inside your project directory, because GitLab CI can only cache paths under the project root. This means you need to create it with conda's -p (prefix) option rather than -n (name).
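Concretely, the difference looks like this (a sketch, not part of the pipeline; the /opt/conda prefix is what the official miniconda images use, and py3p10 is the env name from the question):

```shell
# With -n, the env lands in conda's global envs dir, OUTSIDE the checkout,
# so the GitLab cache cannot pick it up:
#   conda env create -f environment.yml -n py3p10   # -> /opt/conda/envs/py3p10
#
# With -p, the env lands at a path you choose, INSIDE the checkout,
# so `cache:paths` can save and restore it:
#   conda env create -f environment.yml -p .env/    # -> $CI_PROJECT_DIR/.env
```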

The example below closely mirrors the example in the GitLab docs.

The || true is required because, as far as I am aware, conda has no flag that skips the install when an environment already exists at the given -p prefix.

default:
  image: condaforge/miniforge-pypy3
  cache:
    key:
      files:
        - environment.yml
    paths:
      - .env/
  before_script:
    - conda env create -f environment.yml -p .env/ || true
    - shopt -s expand_aliases
    - alias envrun="conda run -p .env/"
    - envrun python -V

pytest:
  stage: test
  script:
    - envrun python -m pytest --junitxml=report.xml
  artifacts:
    when: always
    reports:
      junit: report.xml
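One caveat with || true: it also swallows genuine failures from conda env create. A plain directory check gives the same skip without hiding errors; a minimal sketch, where the setup function stands in for the real conda env create -f environment.yml -p .env/ call:

```shell
#!/bin/sh
# Stand-in for the expensive step; in the real pipeline this would be:
#   conda env create -f environment.yml -p .env/
setup() {
  mkdir -p .env
  echo "environment created"
}

# Only run the expensive step when .env/ was NOT restored from the cache.
# Unlike `|| true`, a real failure inside setup still fails the job.
if [ -d .env ]; then
  echo "cache hit: reusing .env/"
else
  setup
fi
```

In before_script this becomes a one-liner: `[ -d .env ] || conda env create -f environment.yml -p .env/`.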
Zac Siegel