
I work in scientific computing, and I am effectively forced to use conda to install certain maintained packages to do my job. If I want to work on my own package, I need it to play nice with both the conda dependency solver and pip. Ideally, I would simply conda install the local package and let the conda dependency solver keep it compatible with the other software. However, I would also want to be able to pip install the package and/or upload it to PyPI.

Is there a way to develop a standard Python package (using, e.g., pyproject.toml and/or requirements.txt files) that is also compatible with a conda environment? I have searched and haven't found a clear prescription on how to do so.

For conda, one could also specify the required dependencies locally in a *.yml file, but this option is not compatible with installation via pip. One would have to maintain dependencies in both a *.yml file and a requirements.txt file; this duplication requires manual upkeep and is error-prone.
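To illustrate the duplication: a conda environment.yml and a requirements.txt end up repeating the same pins (the package names and versions below are placeholders, not from the question):

```yaml
# environment.yml -- read only by conda
name: myproject-env
channels:
  - conda-forge
dependencies:
  - python>=3.11
  - numpy>=1.24
  - scipy
```

```
# requirements.txt -- read only by pip; same pins repeated by hand
numpy>=1.24
scipy
```

Any version bump now has to be made in two places, which is exactly the maintenance burden described above.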

Note that the conda develop command is officially supported by Anaconda and on the surface looks like it could address this problem; however, it is effectively deprecated and, as of this writing, doesn't seem to be supported on Python 3.11.

Drphoton

1 Answer


I have cobbled together a method that forgoes the *.yml file altogether. I keep all of the dependencies in a requirements.txt file (and all dev requirements in requirements-dev.txt) and use conda (or better yet mamba) to install them. Finally, I use pip to install the package itself.

Below is an example of the file structure for the project (inspired by this post):

Project/
|-- src/
|   |-- __init__.py
|   |-- main.py
|
|-- pyproject.toml
|-- requirements.txt
|-- requirements-dev.txt

Among the other metadata in pyproject.toml, we must specify that we want to dynamically read the requirements from requirements.txt, as well as any other optional dependencies we might want (e.g., requirements-dev.txt):

pyproject.toml

[project]
name = "myproject"
...
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.txt"] }

[tool.setuptools.dynamic.optional-dependencies]
dev = { file = ["requirements-dev.txt"] }
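Note that for setuptools to read these dynamic fields, pyproject.toml also needs a [build-system] table declaring setuptools as the build backend (the version pin below is illustrative; dynamic optional-dependencies from a file requires a reasonably recent setuptools):

```toml
[build-system]
requires = ["setuptools>=64"]
build-backend = "setuptools.build_meta"
```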

Installation with pip

The current structure works perfectly with pip, including editable installs and optional dependencies:

# local installation of the package:
pip install .

# editable install:
pip install -e .

# editable install with optional dependencies:
pip install -e ".[dev]"

Installation with conda

Things are not as straightforward when using conda. When creating an environment from scratch:

conda create -n ENVNAME "python>=3.11" --file requirements.txt

If adding to an established environment, use update:

conda update --name ENVNAME --file requirements.txt

If installing a second set of dependencies (say, dev dependencies stored in requirements-dev.txt), you likely don't want them to alter the versions of the original dependencies; otherwise, you would be testing against a slightly different environment. You can protect the already-installed packages with --freeze-installed:

conda update --name ENVNAME --freeze-installed --file requirements-dev.txt
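Because requirements-dev.txt is layered on top of requirements.txt, it also helps to check that the two files don't declare the same package twice. Below is a small hypothetical helper (not part of conda or pip) that compares the package names in two requirements files, ignoring version specifiers and comments:

```python
# Hypothetical sanity check: report packages that appear in both
# requirements.txt and requirements-dev.txt (compared by name only,
# ignoring version specifiers, extras, and comments).
import re
from pathlib import Path


def requirement_names(path):
    """Return the set of lowercase package names listed in a requirements file."""
    names = set()
    for line in Path(path).read_text().splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments and whitespace
        if not line:
            continue  # skip blank/comment-only lines
        # The package name is the leading run of name characters,
        # before any version specifier like ">=1.24" or extras like "[dev]".
        match = re.match(r"[A-Za-z0-9._-]+", line)
        if match:
            names.add(match.group(0).lower())
    return names


def duplicated(run_file, dev_file):
    """Packages declared in both the runtime and the dev requirements."""
    return requirement_names(run_file) & requirement_names(dev_file)
```

For example, `duplicated("requirements.txt", "requirements-dev.txt")` returns the overlapping package names, which should normally be an empty set.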

Finally, with all of the dependencies taken care of by conda, the package itself can be installed into the environment using pip:

pip install --no-build-isolation --no-deps .

The --no-deps and --no-build-isolation flags are necessary to keep pip from meddling with the environment being maintained by conda.

Drphoton