I am using Python to work in Maya, and over the months I have built up a lot of different modules that make up various tools.
Initially I simply stacked everything inside one single root package, made up of huge modules containing a mix of functions, classes, etc., depending on the situation, with a separate "core" module holding all the generic stuff.
root (package)
    animation.py (module)
    audio.py (module)
    modelling.py (module)
    rigging.py (module)
    library.py (module)
    core.py (module)
Recently I have seen some examples of a different approach, which consists of building a package for each macro area and splitting the code inside it into various smaller modules, something like this:
root (package)
    animation (package)
        __init__.py (module)
        core.py (module)
        utils.py (module)
        ui.py (module)
        bar.py (module)
    audio (package)
        __init__.py (module)
        core.py (module)
        utils.py (module)
        ui.py (module)
        foo.py (module)
    rigging (package)
        __init__.py (module)
        core.py (module)
        utils.py (module)
        ui.py (module)
        foo.py (module)
    etc.
Inside each __init__.py the sibling modules are imported like this:
from .core import *
from .utils import *
from .ui import Window
from .foo import Foo
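One thing that seems to help keep the star imports predictable is defining __all__ in each submodule, so that from .core import * only pulls in the names meant to be public. A simplified sketch (bake_keys and Clip are just placeholder names, not my real code):
# animation/core.py -- simplified sketch with placeholder names
__all__ = ["bake_keys", "Clip"]

def bake_keys():
    """Bake animation keys (placeholder)."""
    pass

class Clip(object):
    """An animation clip (placeholder)."""
    pass

def _setup_scene():
    # Not listed in __all__, so the star import in __init__.py skips it.
    pass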
That way, when the whole package is imported, the result is much the same as having everything inside one huge module. In the end, the same code works to access the functionality under either structure:
import animation
animation.Foo()
My question is:
Which of these approaches is the more logical way to organise code? On the one hand, the package structure looks better organised, but it is giving me more cyclic imports and reload problems (which may just be a sign of bad import habits on my part; the kind of cycle I mean is sketched below). On the other hand, the classic flat modules are becoming quite hard to navigate because of the amount of stuff I have put in them.
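To illustrate the cycles I run into (module and function names here are made up): animation.core wants something from rigging.core, while rigging.core also imports animation.core at the top level, so importing either one blows up or hands back a partially initialized module. One workaround is deferring one of the imports into the function that needs it:
# root/animation/core.py -- made-up names, just to show the cycle
# A top-level "from root.rigging import core" would fail here if
# rigging.core also imports animation.core at the top level.

def attach_clip_to_rig(clip, rig):
    # Importing inside the function delays the import until call time,
    # which breaks the cycle during module import.
    from root.rigging import core as rigging_core
    rigging_core.bind(clip, rig)  # bind() is a placeholder function
I am not sure whether leaning on deferred imports like this is good practice, or just a symptom of the structure itself.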