I am experimenting with putting my Python code into the standard directory structure used for deployment with setup.py and maybe PyPI. For a Python library called mylib, it would be something like this:
```
mylibsrc/
    README.rst
    setup.py
    bin/
        some_script.py
    mylib/
        __init__.py
        foo.py
```
There's often also a `test/` subdirectory, but I haven't tried writing unit tests yet. The recommendation to have scripts in a `bin/` subdirectory can be found in the official Python packaging documentation.
Of course, the scripts start with code that looks like this:

```python
#!/usr/bin/env python
from mylib.foo import something
something("bar")
```
This works well when it eventually comes to deploying the script (e.g. to devpi) and then installing it with pip. But if I run the script directly from the source directory, as I would while developing new changes to the library/script, I get this error:
```
ImportError: No module named 'mylib'
```

This is true even if the current working directory is the root `mylibsrc/` and I ran the script by typing `./bin/some_script.py`. This is because Python starts searching for packages in the directory of the script being run (i.e. in `bin/`), not the current working directory.
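This behaviour is easy to verify. Below is a quick sketch (the temporary layout and script name are illustrative, not from a real project) showing that the first entry of `sys.path` is the script's own directory rather than the working directory:

```python
import os
import subprocess
import sys
import tempfile

# Recreate the bin/-with-a-script layout, then run the script from its
# parent directory, as you would during development.
with tempfile.TemporaryDirectory() as tmp:
    bin_dir = os.path.join(tmp, "bin")
    os.mkdir(bin_dir)
    script = os.path.join(bin_dir, "show_path.py")
    with open(script, "w") as f:
        f.write("import sys; print(sys.path[0])\n")

    result = subprocess.run(
        [sys.executable, script], cwd=tmp, capture_output=True, text=True
    )
    first_entry = result.stdout.strip()

    # sys.path[0] is the script's own bin/ directory, so a sibling
    # package like mylib/ (next to bin/, not inside it) is invisible.
    print(os.path.realpath(first_entry) == os.path.realpath(bin_dir))
```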
What is a good, permanent way to make it easy to run scripts while developing packages?
Here is another relevant question (especially the comments on the first answer).
The solutions for this that I've found so far fall into three categories, but none of them are ideal:
- Manually fix up your Python module search path somehow before running your scripts.
    - You can manually add `mylibsrc` to your `PYTHONPATH` environment variable. This seems to be the most official (Pythonic?) solution, but it means that every time I check out a project I have to remember to manually change my environment before I can run any code in it.
    - You can add `.` to the start of your `PYTHONPATH` environment variable. As I understand it, this could have some security problems. It would actually be my favoured trick if I were the only person using my code, but I'm not, and I don't want to ask others to do this.
    - While looking at answers on the internet, I've seen recommendations that files in a `test/` directory all (indirectly) include the line `sys.path.insert(0, os.path.abspath('..'))` (e.g. in Structuring Your Project). Yuck! This seems like a bearable hack for files that are only for testing, but not for those that will be installed with the package.
    - Edit: I have since found an alternative, which turns out to be in this category: when running scripts with Python's `-m` option, the search path starts in the working directory instead of the `bin/` directory. See my answer below for more details.
- Install the package into a virtual environment before using it, using a setup.py (either running it directly or using pip).
    - This seems like overkill if I'm just testing a change that I'm not sure is even syntactically correct yet. Some of the projects I'm working on aren't even meant to be installed as packages, but I want to use the same directory structure for everything, and this would mean writing a setup.py just so I could test them!
    - Edit: Two interesting variants of this are discussed in the answers below: the `setup.py develop` command in logc's answer and `pip install -e` in mine. They avoid having to re-"install" for every little edit, but you still need to create a `setup.py` for packages you never intend to fully install, and they don't work very well with PyCharm (which has a menu entry to run the `develop` command but no easy way to run the scripts that it copies to the virtual environment).
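For the `setup.py develop` / `pip install -e` route, a minimal setup.py for the layout above might look like this (the name, version, and script list are assumptions for illustration):

```python
# Minimal, illustrative setup.py for the mylibsrc/ layout above.
from setuptools import setup, find_packages

setup(
    name="mylib",
    version="0.0.1",
    packages=find_packages(exclude=["test", "test.*"]),
    # Copies bin/some_script.py onto the environment's PATH at install time.
    scripts=["bin/some_script.py"],
)
```

With this in place, `pip install -e .` inside a virtual environment links the working copy into the environment, so edits to `mylib/` take effect without reinstalling.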
- Move the scripts to the project's root directory (i.e. into `mylibsrc/` instead of `mylibsrc/bin/`).
    - Yuck! This is a last resort, but unfortunately it seems like the only feasible option at the moment.