
It's possible to specify the Python environment via an environment.yml file and then use conda to create/activate it. However, for some projects I might want finer-grained control over the environments in which Python code is executed.
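For context, a minimal sketch of that single-environment workflow (the environment name and packages below are just placeholders):

```yaml
# environment.yml -- illustrative contents
name: project-env
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pandas
  - jupyterlab
```

```bash
# create and activate the environment from the file
conda env create -f environment.yml
conda activate project-env
```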

For example, suppose I have 5 notebooks with different (and potentially conflicting) dependencies. One option is to maintain multiple environment file definitions, which can also be switched between during interactive sessions via nb_conda_kernels (see the sketch below), but is there a more elegant way to achieve this, something that avoids creating multiple environment files?
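For reference, the multi-file approach might look roughly like this (the environment file names are placeholders): nb_conda_kernels is installed in the environment that launches Jupyter, and every conda environment containing ipykernel then appears as a selectable kernel.

```bash
# the environment that launches Jupyter needs nb_conda_kernels
conda install -n base nb_conda_kernels

# one environment file per notebook, each including ipykernel
conda env create -f notebook1_env.yml
conda env create -f notebook2_env.yml

# starting Jupyter from base now offers one kernel per environment
jupyter lab
```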

Metaflow has a decorator (https://docs.metaflow.org/metaflow/dependencies) that allows specifying dependencies for individual steps in a pipeline; is there a way to achieve a similar result without metaflow?
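For reference, the metaflow approach looks roughly like this (the flow name, library choices, and versions are illustrative); each step gets its own isolated conda environment:

```python
from metaflow import FlowSpec, conda, conda_base, step

@conda_base(python="3.8")                    # base interpreter for all steps
class MultiEnvFlow(FlowSpec):

    @conda(libraries={"pandas": "1.3.3"})    # step-specific dependency
    @step
    def start(self):
        import pandas as pd                  # resolved inside this step's env
        print(pd.__version__)
        self.next(self.end)

    @conda(libraries={"numpy": "1.21.2"})    # a different, isolated env
    @step
    def end(self):
        import numpy as np
        print(np.__version__)

if __name__ == "__main__":
    MultiEnvFlow()
```

The per-step environments are only created when the flow is run with the conda environment enabled, e.g. `python flow.py --environment=conda run`.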

SultanOrazbayev
  • What control do you want? What could be more fine-grained control than declaring in a configuration file exactly what dependencies you want? In what ways do you think environment control in your workflow is not "Pythonic" and what do you wish it would look like instead? – Matt Thompson Sep 15 '21 at 15:06
  • Thanks for the comments, I updated the question. I agree, having a configuration file is exact and explicit (which is great), but what happens if I have multiple environments I would like to use within a single project? (there is the docker way, but perhaps there is something more lightweight) – SultanOrazbayev Sep 15 '21 at 15:46

0 Answers