
I have a file structure like this:

project/
    __init__.py
    common/
        __init__.py
        util.py
        utilhelper.py
    stuff/
        __init__.py
        coolstuff/
            __init__.py
            awesome.py

common/__init__.py

import util, utilhelper

common/util.py

import utilhelper
help = utilhelper.help()

stuff/coolstuff/awesome.py

from common import util
print util.help

When I do:

python project/stuff/coolstuff/awesome.py

"from common import util" fails with "no module named common".

I realize that I am missing some seriously important mental concepts like paths and packaging, because I just have no idea how to solve this. But if possible, I want to keep code like "from common import util" in deep subdirectories.

I've considered:

- setting up Paver to inject path dependencies into tasks that run Python scripts via sh:

from paver.easy import needs, sh, task

@task
@needs(['common'])
def dostuff():
    sh('python stuff/coolstuff/awesome.py')

Unfortunately I have no idea what I'm doing and cannot find any good examples/tutorials on this.

- using imp to explicitly import relative and/or absolute file paths in 100% of my scripts
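
To make the imp option concrete, this is roughly what I'd have to put at the top of awesome.py; the absolute paths here are made up for illustration:

import imp

# Hypothetical hard-coded paths -- every script would need its own copy
# of these, pointing at wherever the project lives on that machine.
utilhelper = imp.load_source('utilhelper',
                             '/home/me/project/common/utilhelper.py')
util = imp.load_source('util', '/home/me/project/common/util.py')
print util.help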

- writing little hacky path inserts at the top of every Python file
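
And the hacky path insert option would mean something like this at the top of every file (again, just a sketch of the idea):

import os
import sys

# Prepend the project/ directory (two levels up from this file) to
# sys.path so "from common import util" can be resolved.
sys.path.insert(0, os.path.abspath(
    os.path.join(os.path.dirname(__file__), '..', '..')))

from common import util
print util.help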

I'd really appreciate any advice.

  • See http://stackoverflow.com/questions/11536764/attempted-relative-import-in-non-package-even-with-init-py/ and http://stackoverflow.com/questions/14132789/python-relative-imports-for-the-billionth-time and read up on relative imports. – BrenBarn Oct 04 '13 at 19:21

1 Answer


As a quick fix, I:

  • moved the common directory inside stuff/
  • used relative imports in awesome.py: "from ..common import util"
  • used sh in a taskrunner file to hard-code python -m (module mode) launch commands for the scripts, roughly as sketched below
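
Roughly what one of those sh calls looked like (a sketch; the task name is made up, and it assumes the task runs from the project root):

from paver.easy import sh, task

@task
def run_awesome():
    # Launch awesome.py as a module so the relative import
    # "from ..common import util" resolves against the package layout.
    sh('python -m stuff.coolstuff.awesome')
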
  • As an even better fix, I added my project's directory to my PYTHONPATH. It required far less code to implement, at the cost of having to set environment variables on every machine that ran the code. – Brian Weidenbaum Oct 09 '13 at 06:06