For reference, I've looked at the following links:
- Python Imports, Paths, Directories & Modules
- Importing modules from parent folder
I understand that what I'm doing is wrong, and I'm trying to avoid relative paths and `sys.path` manipulation as much as possible, though if those are my only options, please help me come up with a solution.
Here is an example of my current directory structure. For a little more context: I started off adding `__init__.py` to every directory so they would be considered packages and subpackages, but I'm not sure that is what I actually want.
```
myapp/
    pack/
        __init__.py
        helper.py
    runservice/
        service1/
            Dockerfile
        service2/
            install.py
            Dockerfile
```
The only package I will be importing from lives in the `pack/` directory, so I believe that should be the only directory Python considers a package.
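For concreteness, here is roughly the shape of `pack/helper.py` (the body is a stand-in; `function` is the actual name I import later):

```python
# pack/helper.py -- stand-in body for illustration; the real helper
# contains the shared logic the services need
def function():
    print("pack.helper.function called")
```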
Next, the reason this might get a little tricky: ultimately, this is just a service that builds various containers, where each entrypoint lives in `service*/install.py` and I `cd` into the script's working directory before running it. The reason for this is that I don't want container1 (`service1`) to know about the codebase in `service2`, as it's irrelevant to it, and I would like the code to be kept separated.
But when running `install.py`, I need to be able to do `from pack.helper import function`, and clearly I am doing something wrong.
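To make the failure concrete, here is a stripped-down sketch of `service2/install.py` (the `__main__` block is just for illustration):

```python
# runservice/service2/install.py -- stripped-down illustration
# Run as: cd service2 && python install.py
# This currently fails (ModuleNotFoundError: No module named 'pack'),
# since myapp/ is not on sys.path when the script runs from service2/
from pack.helper import function

if __name__ == "__main__":
    function()
```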
Can someone help me come up with a solution so I can keep my container entrypoint as `cd service2 && python install.py`?
Another important thing to note: within the script I have logic like `if not os.path.isdir(os.path.expanduser(tmpDir)):`, and I am hoping any solution we come up with will not affect it.
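For example, that check sits in context roughly like this (the value of `tmpDir` here is made up):

```python
import os

tmpDir = "~/.myapp/tmp"  # hypothetical value, just for illustration

# expanduser resolves "~" via the HOME environment variable, not the
# current working directory or sys.path, so an import fix should leave
# this check's behavior alone
if not os.path.isdir(os.path.expanduser(tmpDir)):
    os.makedirs(os.path.expanduser(tmpDir))
```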
I apologize for the noob question.
EDIT:

Note, I think I can do something like

```python
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
```

But as far as I understand, that is bad practice...
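Given the layout above, `install.py` sits two levels below `myapp/`, so if I went this route I believe the append would need to climb two directories; a minimal sketch:

```python
# top of runservice/service2/install.py -- sys.path workaround sketch
import os
import sys

# climb service2/ -> runservice/ -> myapp/, the directory containing pack/
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..")))

from pack.helper import function  # resolvable now that myapp/ is on sys.path
```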