
I want to split a large Python module I wrote into multiple files within a directory, where each file contains a single function that may or may not depend on other functions in the module. Here's a simple example of what I came up with:

First, here's a self-contained .py module:

#[/pie.py]
def getpi():
    return pi()

def pi():
    return 3.1416

Obviously, this works fine when importing and calling either function. So now I split it into different files, with an __init__.py file to wrap it all up:

#[/pie/__init__.py]
from getpi import *
from pi import *
__all__=['getpi','pi']

#[/pie/getpi.py]
def getpi():
    return pi()

#[/pie/pi.py]
def pi():
    return 3.1416

Because getpi() depends on pi(), calling it as currently structured raises an exception:

>>> import pie
>>> pie.getpi()

Traceback (most recent call last):
  File "<pyshell#7>", line 1, in <module>
    pie.getpi()
  File "C:\python\pie\getpi.py", line 2, in getpi
    return pi()
NameError: global name 'pi' is not defined

The wildcard import makes getpi visible in the pie package, but the name pi is looked up in getpi.py's own global namespace, which never imported it. So to fix this issue, my current solution is to write __init__.py like so:

#[/pie/__init__.py]
import os as _os

__all__ = []
for _f in _os.listdir(__path__[0]):
    if not _f == '__init__.py' and _f.endswith('.py'):
        execfile('%s\\%s'%(__path__[0],_f))
        __all__.append(_os.path.splitext(_f)[0])

This works because execfile runs each file's code directly in __init__.py's namespace, so getpi and pi end up as globals of the same module, just as in the single-file version. So now it works fine:

>>> import pie
>>> pie.getpi()
3.1416

So now everything works as if it were all contained in a single .py file. __init__.py can hold all the high-level imports (numpy, os, sys, glob...) that the individual functions need.

Structuring a module this way feels "right" to me. New functions are picked up automatically at the next import (no need to edit __init__.py each time). It lets me see at a glance which functions are meant to be used just by looking at the directory contents, and it keeps everything nicely sorted alphabetically.

The only negative I can see at this time is that only __init__.py gets byte-compiled, not any of the sub .py files. But loading speed hasn't been an issue, so I don't mind. I also realize this might cause issues with packaging, but that doesn't bother me either, because our scripts are distributed via our own revision control system.

Is this an acceptable way of structuring a Python module? And if not, what would be the correct way to achieve this properly?

Fnord

1 Answer


The "correct" way would be to import the necessary modules where they are needed:

# pi.py
def pi(): return 3.1416

# getpi.py
from .pi import pi
def getpi(): return pi()

# __init__.py
from .pi import *
from .getpi import *
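
With this layout the original session behaves just like the single-file version:

>>> import pie
>>> pie.getpi()
3.1416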

Make sure you don't have cyclic dependencies. These are bad in any case, but you can avoid them by moving shared code up to a level both sides can import from.
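
To illustrate with hypothetical names: if pi.py ever needed a helper that lived in getpi.py while getpi.py keeps importing pi, the two files would import each other. The usual fix is a third module that both can import; a minimal sketch, where _helpers.py and fmt() are invented for this example:

# [/pie/_helpers.py]
def fmt(value):
    # shared helper both submodules can use without importing each other
    return round(value, 4)

# [/pie/pi.py]
from ._helpers import fmt
def pi(): return fmt(3.14159265)

# [/pie/getpi.py]
from .pi import pi
def getpi(): return pi()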

Niklas B.
  • To keep the dynamic load nature of my __init__.py, is it ok to replace `execfile('%s\\%s'%(__path__[0],_f))` with `exec 'from .%s import *'%_os.path.splitext(_f)[0]` ? – Fnord Mar 25 '14 at 22:50
  • @Fnord I would rather use [`importlib`](http://docs.python.org/dev/library/importlib.html) for that – Niklas B. Mar 25 '14 at 22:56
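
For reference, a minimal sketch of that importlib-based variant, assuming the question's convention that each file defines exactly one function named after the file (that naming assumption comes from the question, not from importlib):

# [/pie/__init__.py]
import os as _os
import importlib as _importlib

__all__ = []
for _f in sorted(_os.listdir(__path__[0])):
    if _f.endswith('.py') and _f != '__init__.py':
        _name = _os.path.splitext(_f)[0]
        # import pie.<name> as a real submodule...
        _mod = _importlib.import_module('.' + _name, __name__)
        # ...and expose the same-named function at package level
        globals()[_name] = getattr(_mod, _name)
        __all__.append(_name)

Note that unlike the execfile approach, each file is now imported as a real module, so getpi.py still needs its own explicit from .pi import pi.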