I have created a Python package. In its __init__.py there is a line that reads
from ._constants import initial_time, final_time, time_step
When these variables are imported from the module _constants.py, the code in that file is executed, which reads some data from a folder located next to the package (i.e. the data is read using relative paths like '../data/file.csv').

The problem is that I am importing this package from a script outside of it, which causes a FileNotFoundError, since the paths are now resolved relative to where the package is imported from, not relative to the file that reads the data.
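For concreteness, here is a minimal sketch of what _constants.py currently does (the column names and the parsing are illustrative, not the real code):

# package/_constants.py -- simplified sketch
import csv

# This relative path is resolved against the current working directory,
# not against the directory that contains this file.
with open('../data/file.csv', newline='') as f:
    _row = next(csv.DictReader(f))

initial_time = float(_row['initial_time'])
final_time = float(_row['final_time'])
time_step = float(_row['time_step'])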
I could see this type of setup being a common use case where some reference datasets need to be loaded when a package is imported. For example, in statsmodels
there are reference statistical datasets that can be used to test some of the functionality of the package.
What is the "proper" way to load up data in a python package so that when the package is imported the files are found and the relative paths used to find them still work?
Here is an example directory structure to illustrate what I mentioned above:
project/
    data/
        file.csv
    package/
        __init__.py
        _constants.py
    main.py
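And here is roughly how the failure shows up (assuming main.py is the outside script at the project root):

# project/main.py -- the script importing the package
# Running `python main.py` from project/ raises FileNotFoundError,
# because '../data/file.csv' is resolved against the working directory
# (project/), not against package/_constants.py.
from package import initial_time, final_time, time_step

print(initial_time, final_time, time_step)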
Note: I know that I can change the paths so that they are relative to main.py. I've already done that, but I need a more general solution now that I am writing tests and need to import parts of the package from a different location.
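For example (the tests/ layout below is hypothetical), once the paths are rewritten relative to main.py, they break again as soon as the package is imported with a different working directory:

# tests/test_constants.py -- hypothetical test layout
# If pytest is invoked from anywhere other than project/, a path
# rewritten relative to main.py (e.g. 'data/file.csv') no longer
# resolves, and simply importing the package raises FileNotFoundError.
from package import time_step

def test_time_step_is_positive():
    assert time_step > 0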