35

When organising a Python project, this structure seems to be a standard way of doing it:

myproject\
    bin\
        myscript
    mypackage\
        __init__.py
        core.py
    tests\
        __init__.py
        mypackage_tests.py
    setup.py

My question is, how do I import my core.py so I can use it in myscript?

Both __init__.py files are empty.

Content of myscript:

#!/usr/bin/env python
from mypackage import core
if __name__ == '__main__':
    core.main()

Content of core.py:

def main():
    print 'hello'

When I run myscript from inside myproject directory, I get the following error:

Traceback (most recent call last):
  File "bin/myscript", line 2, in <module>
    from mypackage import core
ImportError: No module named mypackage

What am I missing?


4 Answers

7

Usually, setup.py should install the package in a place where the Python interpreter can find it, so after installation import mypackage will work. To facilitate running the scripts in bin right from the development tree, I'd usually simply add a symlink to ../mypackage/ in the bin directory. Of course, this requires a filesystem supporting symlinks…
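
For reference, on a POSIX system the symlink could be created from the project root with something like `ln -s ../mypackage bin/mypackage`. For the installation half, a minimal setup.py might look like the sketch below; the name and version are placeholders, and scripts= is just one possible way to get bin/myscript installed, not necessarily what this answer had in mind:

# A minimal sketch, assuming setuptools; metadata values are placeholders.
from setuptools import setup

setup(
    name='myproject',
    version='0.1',
    packages=['mypackage'],    # makes "import mypackage" work after installation
    scripts=['bin/myscript'],  # installs the script somewhere on PATH
)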

– Sven Marnach
  • Using the symlink as you suggested, the script was able to find mypackage, but I've now run into another issue... See my edit above in the question. – tim_wonil Jul 23 '12 at 12:57
  • Instead of using "clever" symlinks in development, use a [virtualenv](http://pypi.python.org/pypi/virtualenv/) and run `python setup.py develop` to have distutils install the executable in your path (when you have activated that virtualenv). – SingleNegationElimination Jul 23 '12 at 13:01
  • Actually, I figured out that other issue. – tim_wonil Jul 23 '12 at 13:41
  • @TokenMacGuy: virtualenv seems like a good solution, I'll have to learn more about it. – tim_wonil Jul 23 '12 at 13:42
  • @TokenMacGuy: I agree that virtualenv is the more flexible and robust solution. However, for simple situations (a single module or package), using a symlink always worked fine for me. The symlink is easier to set up and allows testing command-line tools without running `python setup.py`, so it also has its merits. – Sven Marnach Jul 23 '12 at 14:32
  • @SvenMarnach a single module or package? better to be prepared for the future ;p –  Jul 18 '13 at 20:28
0

I'm not sure if there is a "best choice", but the following is my normal practice:

  1. Put whatever script I want to run in bin/.

  2. Run "python -m bin.script" from the myproject directory.

  3. When importing in script.py, treat the directory that script.py sits in as the root, so:

    from ..mypackage import core

If the system supports symlinks, the symlink approach is the better choice.
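
For the question's layout, this approach would look roughly like the sketch below. Two assumptions are added here that are not in the answer: the script gets a .py extension so it can be imported as a module, and (on Python 2) bin/ gets an empty __init__.py so that python -m can find it. Because python -m also puts the current working directory on sys.path, the plain absolute import works when run from myproject; the relative form shown above only works if bin and mypackage share a common parent package.

# bin/myscript.py (sketch): run as "python -m bin.myscript" from the myproject
# directory, with an empty bin/__init__.py added (an assumption, needed on Python 2).
from mypackage import core  # resolves because the current directory is on sys.path

if __name__ == '__main__':
    core.main()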

– Patrick the Cat
0

I usually add my bin path to $PYTHONPATH; that enables Python to look for the requested module in the bin directory too.

$ export PYTHONPATH=/home/username/bin:$PYTHONPATH
$ python
>>> import module_from_bin
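
Applied to the layout in the question, the directory that needs to be on the path is the one containing mypackage/ (i.e. myproject itself) rather than bin, e.g. `export PYTHONPATH=/path/to/myproject:$PYTHONPATH`, where /path/to is a placeholder. The same idea can also be expressed inside the script itself; the following is a sketch added for illustration, not part of this answer:

# bin/myscript (sketch): put the project root (the directory containing
# mypackage/) on sys.path before importing it.
import os
import sys

project_root = os.path.join(os.path.dirname(os.path.abspath(__file__)), os.pardir)
sys.path.insert(0, project_root)

from mypackage import core

if __name__ == '__main__':
    core.main()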
– Sufyan
0

I solved the issue by following the setuptools documentation.

In setup.py you can specify the packages as an argument to the setup() function:

packages=find_packages()

This finds all packages.

P.S. You have to import this function: from setuptools import setup, find_packages
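
Put together, a minimal setup.py along these lines might look like the following sketch; the name and version are placeholders, and tests/ is excluded here since it is a package of its own in the question's layout:

# A minimal sketch using find_packages(); metadata values are placeholders.
from setuptools import setup, find_packages

setup(
    name='myproject',
    version='0.1',
    packages=find_packages(exclude=['tests']),
)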

– aerijman