
This article has a snippet showing usage of __bases__ to dynamically change the inheritance hierarchy of some Python code, by adding a class to the collection of classes from which an existing class inherits. OK, that's hard to read; code is probably clearer:

class Friendly:
    def hello(self):
        print 'Hello'

class Person: pass

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"

That is, Person doesn't inherit from Friendly at the source level; rather, this inheritance relation is added dynamically at runtime by modifying the __bases__ attribute of the Person class. However, if you change Friendly and Person to be new-style classes (by inheriting from object), you get the following error:

TypeError: __bases__ assignment: 'Friendly' deallocator differs from 'object'
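
For completeness, here is a minimal sketch of the new-style variant that produces that error (the same code as above, just inheriting from object):

class Friendly(object):
    def hello(self):
        print 'Hello'

class Person(object): pass

p = Person()
Person.__bases__ = (Friendly,)  # raises the TypeError shown above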

A bit of Googling on this seems to indicate some incompatibilities between new-style and old-style classes with regard to changing the inheritance hierarchy at runtime. Specifically: "New-style class objects don't support assignment to their bases attribute".

My question: is it possible to make the above Friendly/Person example work using new-style classes in Python 2.7+, possibly by use of the __mro__ attribute?

Disclaimer: I fully realize that this is obscure code, and that in real production code tricks like this tend to border on unreadable. This is purely a thought experiment, and for funzies, to learn something about how Python deals with issues related to multiple inheritance.

Adam Parkin
  • It's also nice for learners to read this if they are not familiar with metaclass, type(), ...: http://www.slideshare.net/gwiener/metaclasses-in-python :) – hkoosha Jan 30 '14 at 11:30
  • Here's my use case. I'm importing a library that has class B inheriting from class A. – FutureNerd Jun 25 '14 at 02:39
  • Here's my actual use case. I'm importing a library that has class B inheriting from class A. I want to create New_A inheriting from A, with new_A_method(). Now I want to create New_B inheriting from... well, from B as if B inherited from New_A, so that B's methods, A's methods, and new_A_method() are all available to instances of New_B. How can I do this without monkey-patching the existing class A? – FutureNerd Jun 25 '14 at 02:49
  • Couldn't you have ```New_B``` inherit from both ```B``` and ```New_A```? Remember Python supports multiple inheritance. – Adam Parkin Jun 25 '14 at 16:28
  • After a bit of googling, the following python bug report seemed relevant... http://bugs.python.org/issue672115 – mgilson Jan 22 '15 at 07:44
  • related: https://stackoverflow.com/questions/29413046/python-how-to-rebase-or-dynamically-replace-a-class-with-a-different-base-class – Jonathan Komar Jan 03 '22 at 11:49

7 Answers


Ok, again, this is not something you should normally do, this is for informational purposes only.

Where Python looks for a method on an instance object is determined by the __mro__ attribute of the class which defines that object (the Method Resolution Order attribute). Thus, if we could modify the __mro__ of Person, we'd get the desired behaviour. Something like:

setattr(Person, '__mro__', (Person, Friendly, object))

The problem is that __mro__ is a readonly attribute, and thus setattr won't work. Maybe if you're a Python guru there's a way around that, but clearly I fall short of guru status as I cannot think of one.

A possible workaround is to simply redefine the class:

def modify_Person_to_be_friendly():
    # so that we're modifying the global identifier 'Person'
    global Person

    # now just redefine the class using type(), specifying that the new
    # class should inherit from Friendly and have all attributes from
    # our old Person class
    Person = type('Person', (Friendly,), dict(Person.__dict__)) 

def main():
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()  # works!

What this doesn't do is modify any previously created Person instances to have the hello() method. For example (just modifying main()):

def main():
    oldperson = Person()
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()  
    # works!  But:
    oldperson.hello()
    # does not

If the details of the type call aren't clear, then read e-satis' excellent answer on 'What is a metaclass in Python?'.
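
(As a quick aside, not from the original answer: the three-argument form of type() takes the class name, a tuple of base classes, and a dict of class attributes, so the two definitions below are equivalent. The species attribute is a made-up example.)

class Person(Friendly):
    species = 'human'

# ...is equivalent to...
Person = type('Person', (Friendly,), {'species': 'human'})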

Adam Parkin
  • -1: why force the new class to inherit from Friendly only, when you could just as well preserve the original `__mro__` by calling: `type('Person', (Friendly,) + Person.__mro__, dict(Person.__dict__))` (and better yet, add safeguards so that Friendly doesn't end up in there twice). There are other issues here, such as where the Person class is actually defined and used: your function only changes it in the current module, so other modules using Person will be unaffected. You'd be better off monkey-patching the module where Person is defined (and even there, there are issues). – jsbueno May 22 '15 at 21:23
  • `-1` You've totally missed the reason for the exception in the first place. You can happily modify `Person.__class__.__bases__` in Python 2 and 3, so long as `Person` doesn't inherit from `object` **directly**. See akaRem and Sam Gulve's answers below. This workaround is only working around your own misunderstanding of the problem. – Carl Smith May 26 '15 at 23:29
  • This is a very buggy "solution". Among the other problems with this code, [it breaks the `__dict__` attribute of resulting objects](https://ideone.com/6OyEcw). – user2357112 Feb 18 '19 at 19:02

I've been struggling with this too, and was intrigued by your solution, but Python 3 takes it away from us:

AttributeError: attribute '__dict__' of 'type' objects is not writable

I actually have a legitimate need for a decorator that replaces the (single) superclass of the decorated class. It would require too lengthy a description to include here (I tried, but couldn't get it to a reasonable length and limited complexity; it came up in the context of the use, by many Python applications, of a Python-based enterprise server where different applications needed slightly different variations of some of the code).

The discussion on this page and others like it provided hints that the problem of assigning to __bases__ only occurs for classes with no superclass defined (i.e., whose only superclass is object). I was able to solve this problem (for both Python 2.7 and 3.2) by defining the classes whose superclass I needed to replace as being subclasses of a trivial class:

## T is used so that the other classes are not direct subclasses of object,
## since classes whose base is object don't allow assignment to their __bases__ attribute.

class T: pass

class A(T):
    def __init__(self):
        print('Creating instance of {}'.format(self.__class__.__name__))

## ordinary inheritance
class B(A): pass

## dynamically specified inheritance
class C(T): pass

A()                 # -> Creating instance of A
B()                 # -> Creating instance of B
C.__bases__ = (A,)
C()                 # -> Creating instance of C

## attempt at dynamically specified inheritance starting with a direct subclass
## of object doesn't work
class D: pass

D.__bases__ = (A,)
D()

## Result is:
##     TypeError: __bases__ assignment: 'A' deallocator differs from 'object'
Mitchell Model

I can't vouch for the consequences, but this code does what you want on Python 2.7.2.

class Friendly(object):
    def hello(self):
        print 'Hello'

class Person(object): pass

# we can't change the original classes, so we replace them
class newFriendly: pass
newFriendly.__dict__ = dict(Friendly.__dict__)
Friendly = newFriendly
class newPerson: pass
newPerson.__dict__ = dict(Person.__dict__)
Person = newPerson

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"

We know that this is possible. Cool. But we'll never use it!

akaRem

Right off the bat, all the caveats of messing with the class hierarchy dynamically are in effect.

But if it has to be done, then apparently there is a hack that gets around the "deallocator differs from 'object'" issue when modifying the __bases__ attribute of new-style classes.

You can define an intermediate class Object

class Object(object): pass

which is just a trivial subclass of the built-in object, and derive your new-style classes from it instead of from object directly. That's it: now they can have their __bases__ modified without any problem.

In my tests this actually worked very well: all existing instances (created before changing the inheritance) of the class and its derived classes felt the effect of the change, including their MRO getting updated.
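
A minimal sketch of this workaround, using the Friendly/Person example from the question (assuming plain classes with no __slots__ or custom metaclasses):

class Object(object): pass

class Friendly(Object):
    def hello(self):
        print('Hello')

class Person(Object): pass

p = Person()
Person.__bases__ = (Friendly,)  # no TypeError: Person's base is Object, not object
p.hello()                       # prints "Hello", even on the pre-existing instance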

Sam Gulve

I needed a solution for this which:

  • Works with both Python 2 (>= 2.7) and Python 3 (>= 3.2).
  • Lets the class bases be changed after dynamically importing a dependency.
  • Lets the class bases be changed from unit test code.
  • Works with types that have a custom metaclass.
  • Still allows unittest.mock.patch to function as expected.

Here's what I came up with:

def ensure_class_bases_begin_with(namespace, class_name, base_class):
    """ Ensure the named class's bases start with the base class.

        :param namespace: The namespace containing the class name.
        :param class_name: The name of the class to alter.
        :param base_class: The type to be the first base class for the
            newly created type.
        :return: ``None``.

        Call this function after ensuring `base_class` is
        available, before using the class named by `class_name`.

        """
    existing_class = namespace[class_name]
    assert isinstance(existing_class, type)

    bases = list(existing_class.__bases__)
    if base_class is bases[0]:
        # Already bound to a type with the right bases.
        return
    bases.insert(0, base_class)

    new_class_namespace = existing_class.__dict__.copy()
    # Type creation will assign the correct '__dict__' attribute.
    del new_class_namespace['__dict__']

    metaclass = existing_class.__metaclass__
    new_class = metaclass(class_name, tuple(bases), new_class_namespace)

    namespace[class_name] = new_class

Used like this within the application:

# foo.py

# Type `Bar` is not available at first, so can't inherit from it yet.
class Foo(object):
    __metaclass__ = type

    def __init__(self):
        self.frob = "spam"

    def __unicode__(self): return "Foo"

# … later …
import bar
ensure_class_bases_begin_with(
        namespace=globals(),
        class_name=str('Foo'),   # `str` type differs on Python 2 vs. 3.
        base_class=bar.Bar)

Use like this from within unit test code:

# test_foo.py

""" Unit test for `foo` module. """

import unittest
import mock

import foo
import bar

ensure_class_bases_begin_with(
        namespace=foo.__dict__,
        class_name=str('Foo'),   # `str` type differs on Python 2 vs. 3.
        base_class=bar.Bar)


class Foo_TestCase(unittest.TestCase):
    """ Test cases for `Foo` class. """

    def setUp(self):
        patcher_unicode = mock.patch.object(
                foo.Foo, '__unicode__')
        patcher_unicode.start()
        self.addCleanup(patcher_unicode.stop)

        self.test_instance = foo.Foo()

        patcher_frob = mock.patch.object(
                self.test_instance, 'frob')
        patcher_frob.start()
        self.addCleanup(patcher_frob.stop)

    def test_instantiate(self):
        """ Should create an instance of `Foo`. """
        instance = foo.Foo()
bignose

The above answers are good if you need to change an existing class at runtime. However, if you are just looking to create a new class that inherits from some other class, there is a much cleaner solution. I got this idea from https://stackoverflow.com/a/21060094/3533440, but I think the example below better illustrates a legitimate use case.

def make_default(Map, default_default=None):
    """Returns a class which behaves identically to the given
    Map class, except it gives a default value for unknown keys."""
    class DefaultMap(Map):
        def __init__(self, default=default_default, **kwargs):
            self._default = default
            super().__init__(**kwargs)

        def __missing__(self, key):
            return self._default

    return DefaultMap

DefaultDict = make_default(dict, default_default='wug')

d = DefaultDict(a=1, b=2)
assert d['a'] == 1
assert d['b'] == 2
assert d['c'] == 'wug'

Correct me if I'm wrong, but this strategy seems very readable to me, and I would use it in production code. This is very similar to functors in OCaml.

fredcallaway
  • Not really sure what this has to do with the question as the question really was about dynamically changing base classes at runtime. In any case, what's the advantage of this over just inheriting from ```Map``` directly and overriding methods as needed (much like what the standard ```collections.defaultdict``` does)? As it stands ```make_default``` can only ever return one type of thing, so why not just make ```DefaultMap``` the top level identifier rather than having to call ```make_default``` to get the class to instantiate? – Adam Parkin Jan 05 '16 at 16:18
  • Thanks for the feedback! (1) I landed here while trying to do dynamic inheritance, so I figured I could help someone who followed my same path. (2) I believe that one could use `make_default` to create a default version of some other type of dictionary-like class (`Map`). You can't inherit from `Map` directly because `Map` is not defined until runtime. In this case, you would want to inherit from `dict` directly, but we imagine that there could be a case where you wouldn't know what dictionary-like class to inherit from until runtime. – fredcallaway Jan 05 '16 at 18:07

This method isn't technically inheriting at runtime, since __mro__ can't be changed. But what I'm doing here is using __getattr__ to be able to access any attributes or methods from a certain class. (Read the comments in the order of the numbers placed before them; it makes more sense.)

class Sub:
    def __init__(self, f, cls):
        self.f = f
        self.cls = cls

    # 6) this method will pass the self parameter
    # (which is the original class object we passed)
    # and then it will fill in the rest of the arguments
    # using *args and **kwargs
    
    def __call__(self, *args, **kwargs):
        # 7) the multiple try / except statements
        # are for making sure if an attribute was
        # accessed instead of a function, the __call__
        # method will just return the attribute

        try:
            return self.f(self.cls, *args, **kwargs)
        except TypeError:
            try:
                return self.f(*args, **kwargs)
            except TypeError:
                return self.f

# 1) our base class
class S:
    def __init__(self, func):
        self.cls = func

    def __getattr__(self, item):
        # 5) we are wrapping the attribute we get in the Sub class
        # so we can implement the __call__ method there
        # to be able to pass the parameters in the correct order

        return Sub(getattr(self.cls, item), self.cls)



# 2) class we want to inherit from
class L:
    def run(self, s):
        print("run" + s)

# 3) we create an instance of our base class
# and then pass an instance (or just the class object)
# as a parameter to this instance

s = S(L) # 4) in this case, I'm using the class object

s.run("1")

So this sort of substitution and redirection will simulate the inheritance of the class we wanted to inherit from. And it even works with attributes or methods that don't take any parameters.
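
As a hedged illustration of that last claim (reusing the S and Sub classes above; the greeting attribute is not part of the original example, it's added purely for demonstration):

# hypothetical non-callable attribute, added only for this illustration
L.greeting = "hi"

print(s.greeting())  # prints "hi": both call attempts inside Sub.__call__ raise
                     # TypeError, so it falls through and returns the wrapped value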