I have several sets of utility functions that I have written to analyze scientific data; for example, for making plots I might have a file called plotting_tools.py. I want to use functions from plotting_tools.py across several different projects, each living in its own directory with its own virtual environment.
One way to do this is to have a copy of plotting_tools.py in every project directory, and then just run
from plotting_tools import *
at the top of my workflow (for example, in a Jupyter notebook).
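Concretely, that means a layout like this, with an independent copy of the file in each project (project names here are just placeholders):

project_a/
    plotting_tools.py    # local copy
    analysis.ipynb
project_b/
    plotting_tools.py    # another, independently updated copy
    analysis.ipynb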
However, this approach has an obvious limitation: when I discover a bug in plotting_tools.py, I have to manually update every local copy. Another option is to keep a single master copy somewhere on my computer and import the module from there using importlib:
from importlib.machinery import SourceFileLoader
# load_module() also registers the module in sys.modules,
# which is what makes the star import below work
foo = SourceFileLoader("plotting_tools", "/Users/me/plotting_tools.py").load_module()
from plotting_tools import *
This is a little fragile because the directory is hard-coded, and tools like IPython's %autoreload break.
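For what it's worth, the same trick can be written with importlib.util, which avoids the deprecated load_module(); this is just a sketch, and it has the same hard-coded path:

import importlib.util
import sys

# Build a module spec from an explicit file path, execute the module,
# and register it in sys.modules so later imports can find it.
spec = importlib.util.spec_from_file_location("plotting_tools", "/Users/me/plotting_tools.py")
plotting_tools = importlib.util.module_from_spec(spec)
sys.modules["plotting_tools"] = plotting_tools
spec.loader.exec_module(plotting_tools)

from plotting_tools import *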
I was curious whether there is a more elegant way to deal with this: some sort of local directory containing all of my utility modules that can be treated as a package, so that when I start a new project and virtual environment I can simply install the current version of plotting_tools into that environment, and then update it manually whenever the master version changes. Is the best way to do this to put all of my functions in a GitHub repository, or is there a way to do this purely locally?
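To make the question concrete: what I imagine is something like the following layout (all names hypothetical), which I could then install into each new environment with pip install /Users/me/my_tools, or with pip install -e /Users/me/my_tools if an editable install is the right tool here:

my_tools/
    pyproject.toml
    src/
        my_tools/
            __init__.py
            plotting_tools.py

with a minimal pyproject.toml along the lines of

[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "my-tools"
version = "0.1.0"

after which from my_tools.plotting_tools import * would work in any environment where the package is installed.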