I've seen lots of Python projects on GitHub that use really nice, clean imports, where they define their types in the root of the project and import them from within sub-packages.
For example, you could have a directory structure like this:
```
project
- foo
  - foo.py
- bar
  - baz
    - baz.py
  - bar.py
- main.py
- types.py
```
There may be some class in `types.py` that is a core data type within the project:

```python
# types.py
from dataclasses import dataclass

@dataclass(frozen=True)
class ImportantType:
    """This is a core type within the project that's used by sub-packages."""
    foo: str
    bar: str
```
Then, interestingly, within either sub-package you find code like this:

```python
# foo/foo.py
from project import types

def example():
    f = types.ImportantType("foo", "bar")
    # does some processing on f.
```
From what I've learned, Python imports do not usually work like this. However, it seems really clean to keep your data models in the root of your project and import packages and modules relative to that root.
How do you set up your project in order to be able to use imports like this?
An example of a project using this is thefuck; here's a link to the file importing from root.
Disclaimer: I have searched the code, and nowhere are they adding packages to `PYTHONPATH`, nor are they using the `sys.path` approach.
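To make concrete what I mean by "imports do not usually work like this", here is a minimal sketch that rebuilds the layout above in a temporary directory (the file contents are placeholder values I made up) and shows that the absolute import only succeeds once the directory *containing* `project` is on `sys.path`:

```python
import os
import sys
import tempfile

# Build a throwaway copy of the layout from the question.
parent = tempfile.mkdtemp()
pkg = os.path.join(parent, "project")
os.makedirs(os.path.join(pkg, "foo"))
open(os.path.join(pkg, "__init__.py"), "w").close()
open(os.path.join(pkg, "foo", "__init__.py"), "w").close()
with open(os.path.join(pkg, "types.py"), "w") as fh:
    fh.write("VALUE = 42\n")  # stand-in for ImportantType

# By default, the directory containing `project` is not on sys.path,
# so the absolute import used in foo/foo.py fails.
try:
    from project import types
except ModuleNotFoundError:
    print("default: import fails")

# Once that parent directory is on sys.path, the import works --
# which is why I suspect there is some setup step I'm not seeing.
sys.path.insert(0, parent)
from project import types
print(types.VALUE)  # 42
```

This is exactly the `sys.path` manipulation the disclaimer rules out, which is why I'm asking what other mechanism makes these root-relative imports resolve.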