
In both Python 2 and Python 3 the code:

class Foo(object):
    pass

f = Foo()
f.__call__ = lambda *args: args

f(1, 2, 3)

fails with TypeError: 'Foo' object is not callable. Why does that happen?

PS: With old-style classes it works as expected.

PPS: This behavior is intended (see the accepted answer). As a work-around, it's possible to define a __call__ at class level that simply forwards to another member, and then set that "normal" member to a per-instance __call__ implementation.

Python 3.10.8 (main, Nov  1 2022, 14:18:21) [GCC 12.2.0]
Type 'copyright', 'credits' or 'license' for more information
IPython 8.7.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: class Foo:
   ...:     def __call__(self, *args, **kwargs):
   ...:         return self.call(*args, **kwargs)
   ...: 

In [2]: f = Foo()

In [3]: f.call = lambda *args, **kwargs: (args, kwargs)

In [4]: f(1,2,3)
Out[4]: ((1, 2, 3), {})
6502

2 Answers


Double-underscore methods are always looked up on the class, never the instance. See Special method lookup for new-style classes:

For new-style classes, implicit invocations of special methods are only guaranteed to work correctly if defined on an object’s type, not in the object’s instance dictionary.

That's because the type might need to support the same operation (in which case the special method is looked up on the metatype).

For example, classes are callable (that's how you produce an instance), but if Python looked up the __call__ method on the actual object, then you could never do so on classes that implement __call__ for their instances. ClassObject() would become ClassObject.__call__() which would fail because the self parameter is not passed in to the unbound method. So instead type(ClassObject).__call__(ClassObject) is used, and calling instance() translates to type(instance).__call__(instance).
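A small demonstration of this lookup rule (the Greeter class is illustrative only, not from the question):

```python
class Greeter:
    def __call__(self, name):
        return "hello " + name

g = Greeter()

# Calling the instance is really a lookup on its type:
assert g("world") == type(g).__call__(g, "world")

# Calling the class itself goes through the metaclass (type):
h = type(Greeter).__call__(Greeter)   # equivalent to Greeter()
assert isinstance(h, Greeter)

# A __call__ stored in the instance dict is simply ignored
# by the implicit invocation:
g.__call__ = lambda name: "ignored"
assert g("world") == "hello world"
```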

To work around this, you could add a __call__ method to the class which checks for a __call__ attribute on the instance and, if present, calls it.
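A minimal sketch of that work-around (the Flexible class name is illustrative):

```python
class Flexible:
    # Class-level __call__ that delegates to a per-instance override.
    def __call__(self, *args, **kwargs):
        target = self.__dict__.get('__call__')
        if target is None:
            raise TypeError("no per-instance __call__ set")
        return target(*args, **kwargs)

f = Flexible()
f.__call__ = lambda *args: sum(args)   # stored in the instance dict
print(f(1, 2, 3))  # 6
```

The implicit call still resolves to the class's __call__, but that method then looks in the instance dict explicitly, which ordinary attribute access is allowed to do.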

Martijn Pieters
  • To define a per-instance `__call__` implementation I'm currently using a class-level `__call__` that forwards to a regular member. Seems an ugly solution however... – 6502 Nov 20 '15 at 10:53
  • @6502: that's the recommended work-around. If you really need to delegate to a per-instance callable, then doing so explicitly in a class-level `__call__` is the best way to do so. – Martijn Pieters Nov 20 '15 at 10:55
  • @MartijnPieters What would be a work around to forward all operators (`__add__`, `__rtruediv__`, etc) to an integer object store inside the proxy? I'd like to not implement all methods manually if possible. – danijar Aug 28 '16 at 02:04
  • @danijar: you could use a metaclass to generate the methods from a list of names perhaps. – Martijn Pieters Aug 28 '16 at 08:22
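The metaclass suggestion in the comment above can be sketched roughly like this (ProxyMeta, IntProxy, _wrapped and _delegated are made-up names, not from the thread):

```python
class ProxyMeta(type):
    """Generate forwarding dunders on the class from a list of names."""
    def __new__(mcls, name, bases, ns):
        cls = super().__new__(mcls, name, bases, ns)
        for dunder in ns.get('_delegated', ()):
            def make(dunder):
                def method(self, *args):
                    # Forward to the same dunder on the wrapped object.
                    return getattr(self._wrapped, dunder)(*args)
                method.__name__ = dunder
                return method
            setattr(cls, dunder, make(dunder))
        return cls

class IntProxy(metaclass=ProxyMeta):
    _delegated = ('__add__', '__sub__', '__rtruediv__')

    def __init__(self, value):
        self._wrapped = value

p = IntProxy(10)
print(p + 5)    # 15
print(20 / p)   # 2.0  (int.__truediv__ returns NotImplemented, so __rtruediv__ runs)
```

Because the methods are created on the class, the implicit lookup finds them, and each one delegates to the wrapped integer.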

In new-style classes (the default in 3.x, and any class inheriting from object in 2.x), the attribute-interception methods __getattr__ and __getattribute__ are no longer called for implicit invocations of the dunder methods of instances; instead, the search begins at the class.

The rationale behind this involves a technicality introduced by the presence of metaclasses.

Because classes are instances of metaclasses, and because metaclasses might define certain operations that act on classes, skipping the class (which in this case can be considered an instance) makes sense: you need to invoke the method defined in the metaclass, which is written to process classes (taking cls as its first argument). If instance look-up were used, it would instead find the class's own method, which is defined for instances of that class (taking self).

Another (disputed) reason involves optimization: since built-in operations on instances are invoked very frequently, skipping the instance look-up altogether and going directly to the class saves some time. [Source: Lutz, Learning Python, 5th Edition]


The main area where this might be an inconvenience is when creating proxy objects that overload __getattr__ and __getattribute__ with the intent of forwarding calls to an embedded internal object. Since built-in invocations skip the instance altogether, they won't be caught and, as a result, won't be forwarded to the embedded object.

An easy, but tedious, work-around for this is to actually overload all the dunders you wish to intercept in the proxy object. Then in these overloaded dunders you can delegate the call to the internal object as required.
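A minimal sketch of such a proxy, wrapping a list and forwarding just a couple of dunders by hand (the Proxy class and _obj attribute are illustrative names):

```python
class Proxy:
    def __init__(self, obj):
        self._obj = obj

    # __getattr__ catches ordinary, explicit attribute access...
    def __getattr__(self, name):
        return getattr(self._obj, name)

    # ...but implicit dunder invocation skips it, so each dunder
    # you want to intercept must be overloaded explicitly:
    def __len__(self):
        return len(self._obj)

    def __getitem__(self, index):
        return self._obj[index]

p = Proxy([1, 2, 3])
print(p.count(2))  # 1 -- explicit access, routed through __getattr__
print(len(p))      # 3 -- implicit dunder, needs the explicit __len__
print(p[0])        # 1 -- likewise needs the explicit __getitem__
```

Without the explicit __len__, len(p) would raise a TypeError even though __getattr__ could have found it on the wrapped list.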


The easiest work-around I can think of is setting the attribute on the class Foo with setattr:

setattr(Foo, '__call__', lambda *args: print(args))

f(1, 2, 3)
(<__main__.Foo object at 0x7f40640faa90>, 1, 2, 3)
Dimitris Fasarakis Hilliard
  • You are confusing matters here. Python bypasses `__getattribute__` in some cases as an optimisation. But looking up special methods on the class is not an optimisation choice; it is needed to properly support special methods acting *on the type*. Compare `hash(int)` with `hash(1)`, for example, both need to be supported. – Martijn Pieters Nov 20 '15 at 10:36
  • I added that as a second reason; but couldn't this *also* be considered for optimization since in cases where we have **many** instances of a given class we skip the `__dict__` look-up in the new model? – Dimitris Fasarakis Hilliard Nov 20 '15 at 10:39
  • That lookup is cheap, and when calling a special method on an instance, *there is just one instance*. – Martijn Pieters Nov 20 '15 at 10:40
  • For more info see the documentation I linked to in my answer. You presented the optimisation as the first reason. – Martijn Pieters Nov 20 '15 at 10:40
  • I switched the ordering and added a nice disputed to it (and a source of where I found it). Might test it out and come back asking questions. Thanks for the insights. – Dimitris Fasarakis Hilliard Nov 20 '15 at 10:52
  • @MartijnPieters I don't understand why these methods have to act on the type as well. Why do both `hash(int)` and `hash(1)` have to be supported? If the programmer attaches the method after instantiation, isn't it fair to assume that they were fully aware that `hash(type(foo))` wouldn't work? Otherwise, they could've simply attached the method to the type itself, consciously... what am I missing? – Nearoo Jun 15 '17 at 09:48
  • @Nearoo: how else would you be able to use types as dictionary keys? `dispatch = {int: function_to_handle_integers, float: function_to_handle_floats}`. If `__hash__` was looked up on the instance, then you'd get `int.__hash__()` being called and failing because it's not bound to an actual integer. Instead `type.__hash__(int)` is called, and for actual integers like `1`, `int.__hash__(1)` is called. In other words, this matters because types (classes) are objects *too*, they are instances of `type`. – Martijn Pieters Jun 15 '17 at 12:41
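The dispatch-table example from that comment runs as written (the handler names here are illustrative):

```python
def handle_int(x):
    return "int: %r" % x

def handle_float(x):
    return "float: %r" % x

# Types are usable as dict keys because __hash__ is found on
# their metaclass (type), not on the class acting as an instance:
dispatch = {int: handle_int, float: handle_float}
print(dispatch[type(3.5)](3.5))  # float: 3.5

# The two hash calls resolve at different levels:
assert hash(1) == int.__hash__(1)        # instance -> class's __hash__
assert hash(int) == type.__hash__(int)   # class -> metaclass's __hash__
```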