This kind of hierarchy definition is a bit unusual in Python projects, which is why you're having a hard time implementing it with everyday syntax. You should take a step back and think about how invested in this architecture you really are, and if it isn't too late to rewrite it in a way that adheres more closely to common Python idioms, maybe you should do that instead ("explicit is better than implicit" in particular comes to mind).
That being said, if everyday Python doesn't cut it, you can use strange Python to write what you want without too much hassle. Consider reading up on the descriptor protocol if you want to understand in detail how functions are turned into methods.
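As a quick taste of that protocol: plain functions are themselves descriptors, and attribute access through an instance calls their `__get__`, which is exactly what produces a bound method. A minimal sketch (the names `greet` and `Thing` are just illustrative):

```python
def greet(self):
    """A plain function; `self` is just a regular parameter here."""
    return f"hello from {self.name}"

class Thing:
    def __init__(self, name):
        self.name = name

t = Thing("t1")

# functions implement __get__; calling it binds the function to an instance
bound = greet.__get__(t, Thing)
print(bound())               # hello from t1
print(type(bound).__name__)  # method
```

This is the same machinery Python runs for you whenever you access a method defined on a class.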
MyFunPackage/worlds/__init__.py
from . import world1, world2
This line needs to be updated for any new world_n.py file you create. While this could be automated with dynamic imports, doing so would break any IDE's member hinting and require even shiftier code. You did write that you don't want to change anything else when adding modules, but adding the name of the file to this line is hopefully acceptable.
This file should not contain any other code.
MyFunPackage/worlds/world*.py
def frobulate(self):
    return f'{self.name} has been frobulated'
There is no need to add any special code to world1.py, world2.py, or any of the new files in the worlds folder. Just write your functions in them as you see fit.
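For illustration, a hypothetical world2.py could look like this (the function name `defrobulate` is made up; any plain function taking `self` first works the same way):

```python
# MyFunPackage/worlds/world2.py -- hypothetical example module
def defrobulate(self):
    """Another plain function; it becomes a bound method at Worlds.__init__ time."""
    return f'{self.name} has been defrobulated'
```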
MyFunPackage/helloworlds.py
from types import MethodType, FunctionType, SimpleNamespace

from . import worlds

_BASE_ATTRIBUTES = {
    '__builtins__', '__cached__', '__doc__', '__file__',
    '__loader__', '__name__', '__package__', '__path__', '__spec__'
}


class Worlds:
    def __init__(self, name):
        self.name = name
        # for all modules in the "worlds" package
        for world_name in dir(worlds):
            if world_name in _BASE_ATTRIBUTES:
                continue  # skip the package's own attributes, and
            world = getattr(worlds, world_name)
            function_map = {}
            # collect all functions in them, by
            for func in dir(world):
                if not isinstance(getattr(world, func), FunctionType):
                    continue  # ignoring non-functions, and
                if getattr(world, func).__module__ != world.__name__:
                    continue  # ignoring names that were only imported
                # turn them into methods of the current Worlds instance
                function_map[func] = MethodType(getattr(world, func), self)
            # and add them to a new namespace that is named after the module
            setattr(self, world_name, SimpleNamespace(**function_map))
The module addition logic is completely dynamic and does not need to be updated in any way when you add new files to the worlds folder.
After setting it up as a package and installing it, trying your example code should work:
>>> from MyFunPackage.helloworlds import Worlds
>>> x = Worlds('foo')
>>> x.world1.frobulate()
'foo has been frobulated'
Thanks, Python, for exposing your internal workings so deliberately.
Tangent: Dynamically adding functions to objects, patching vs describing
Using types.MethodType to turn a function into a method hooks into said descriptor protocol and binds the function to the owning instance. This is preferable to patching the instance into the signature for a number of reasons.
I'll give an example real quick, because I think this is good to know. I'll skip the namespace here, since it doesn't change the behavior and would just make it a little harder to read:
class Foo:
    """An example class that does nothing yet."""
    pass


def bar(self, text: str) -> str:
    """An example function, we will add this to an instance."""
    return f"I am {self} and say {text}."


import inspect
import timeit
import types
# now the gang's all here!
Patching with a lambda
>>> foo = Foo()
>>> foo.bar = lambda *args, **kwargs: bar(foo, *args, **kwargs)
>>> foo.bar('baz')
'I am <__main__.Foo object at 0x000001FB890594E0> and say baz.'
# the behavior is as expected, but ...
>>> foo.bar.__doc__
None
# the doc string is gone
>>> foo.bar.__annotations__
{}
# the type annotations are gone
>>> inspect.signature(foo.bar)
<Signature (*args, **kwargs)>
# the parameters and their names are gone
>>> min(timeit.repeat(
... "foo.bar('baz')",
... "from __main__ import foo",
... number=100000)
... )
0.1211023000000182
# this is how long a single call takes
>>> foo.bar
<function <lambda> at 0x000001FB890594E0>
# as far as it is concerned, it's just some lambda function
In short, while the base functionality is reproduced, a lot of information is lost along the way. There is a good chance that this will become a problem down the road, whether because you want to properly document your work, want to use your IDE's type hinting, or have to go through stack traces during debugging and want to know which function exactly caused problems.
While it's completely fine to do something like this to patch out a dependency in a test suite, it's not something you should do in the core of your codebase.
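To make that concrete, a throwaway lambda patch in a test might look like this (the class `Client` and its `fetch` method are purely illustrative stand-ins for some expensive dependency):

```python
class Client:
    def fetch(self, url: str) -> str:
        """Pretend this hits the network; we never want that in a unit test."""
        raise RuntimeError("no network in tests")

def test_report():
    client = Client()
    # shadow the expensive method on this one instance with a canned response
    client.fetch = lambda url: '{"status": "ok"}'
    assert client.fetch("https://example.com") == '{"status": "ok"}'

test_report()
```

Here the lost docstring and signature don't matter, because the patch lives and dies inside a single test.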
Changing the descriptor
>>> foo = Foo()
>>> foo.bar = types.MethodType(bar, foo)
>>> foo.bar('baz')
'I am <__main__.Foo object at 0x00000292AE287D68> and say baz.'
# same so far, but ...
>>> foo.bar.__doc__
'An example function, we will add this to an instance.'
# the doc string is still there
>>> foo.bar.__annotations__
{'text': <class 'str'>, 'return': <class 'str'>}
# same as type annotations
>>> inspect.signature(foo.bar)
<Signature (text: str) -> str>
# and the signature is correct, without us needing to do anything
>>> min(timeit.repeat(
... "foo.bar('baz')",
... "from __main__ import foo",
... number=100000)
... )
0.08953189999999722
# execution time is 25% lower due to less overhead, no delegation necessary here
>>> foo.bar
<bound method bar of <__main__.Foo object at 0x00000292AE287D68>>
# and it knows that it's a method and belongs to an instance of Foo
Binding a function as a method in this way retains all of its metadata. As far as Python is concerned, it is now the same as any other method that was bound statically rather than dynamically.
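In fact, `types.MethodType(bar, foo)` is equivalent to invoking the function's own descriptor hook, `bar.__get__(foo, Foo)`; both produce the same bound-method object, which you can verify through its `__func__` and `__self__` attributes:

```python
import types

class Foo:
    pass

def bar(self, text: str) -> str:
    return f"I am {self} and say {text}."

foo = Foo()
via_type = types.MethodType(bar, foo)
via_descriptor = bar.__get__(foo, Foo)

# both are bound methods wrapping the same function and the same instance
assert via_type == via_descriptor
assert via_type.__func__ is bar and via_type.__self__ is foo
```

So `MethodType` isn't magic on top of the language; it's just an explicit way to do what attribute lookup on a class does implicitly.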