2

I'm simulating a distributed system in which all nodes follow some protocol. This includes assessing some small variations in the protocol. A variation means an alternative implementation of a single method. All nodes always follow the same variation, which is determined by the experiment configuration (only one configuration is active at any given time). What is the clearest way to do this without sacrificing performance?

As an experiment can be quite extensive, I clearly don't want any conditionals. Until now I've just used inheritance, like:

class Node(object):

    def dumb_method(self, argument):
        # ...
        pass

    def slow_method(self, argument):
        # ...
        pass

    # A lot more methods

class SmarterNode(Node):

    def dumb_method(self, argument):
        # A somewhat smarter variant ...
        pass

class FasterNode(SmarterNode):

    def slow_method(self, argument):
        # A faster variant ...
        pass

But now I need to test all possible variants and don't want an exponential number of classes cluttering the source. I also want other people peeping at the code to understand it with minimal effort. What are your suggestions?

Edit: One thing I failed to emphasize enough: for all envisioned use cases, it seems that patching the class upon configuration is good. I mean: it can be made to work with a simple `Node.dumb_method = smart_method`. But somehow it doesn't feel right. Would this kind of solution cause a major headache for a random smart reader?
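To be concrete, here is a minimal sketch of the kind of patching I have in mind (`configure_variants`, `smart_method` and `fast_method` are just illustrative names, not real code):

def smart_method(self, argument):
    # a somewhat smarter variant, written as a plain function taking self ...
    pass

def fast_method(self, argument):
    # a faster variant ...
    pass

def configure_variants(variants):
    """Patch Node once, at configuration time, before any nodes exist."""
    for name, func in variants.items():
        setattr(Node, name, func)

# the experiment configuration picks the active variants:
configure_variants({'dumb_method': smart_method, 'slow_method': fast_method})
# from here on, every Node (and subclass) instance uses the patched methods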

lRem
  • 139
  • 9
  • Maintain a list of methods? As far as I can see, there's nothing that demands dedicated classes. Take a look at the strategy-pattern. – phant0m Jul 13 '11 at 14:03
  • Yes, this is more or less what *strategy* is for. But here it's more constrained in the way that at any time, each client will use exactly the same strategy. So an object encapsulating the algorithm feels redundant and I want to avoid waste of resources. – lRem Jul 13 '11 at 14:30
  • But you already have an object "encapsulating" the algorithm: The class. Keep in mind that each class is an instance of an object (its metaclass, by default `type`) – phant0m Jul 13 '11 at 14:34
  • Hmmm, maybe this would actually be the clearest idea. At least it's already entrenched in the culture. – lRem Jul 13 '11 at 14:47

5 Answers

2

You could use the `__slots__` mechanism and a factory class. You would need to instantiate a `NodeFactory` for each experiment, but it would handle creating `Node` instances for you from there on. Example:

class Node(object):
    __slots__ = ["slow", "dumb"]

class NodeFactory(object):
    def __init__(self, slow_method, dumb_method):
        self.slow = slow_method
        self.dumb = dumb_method
    def makenode(self):
        # note: these are plain callables stored on the instance,
        # not bound methods, so they don't receive the node as self
        n = Node()
        n.dumb = self.dumb
        n.slow = self.slow
        return n

an example run:

>>> def foo():
...     print "foo"
...
>>> def bar():
...     print "bar"
...
>>> nf = NodeFactory(foo, bar)
>>> n = nf.makenode()
>>> n.dumb()
bar
>>> n.slow()
foo
multipleinterfaces
  • 8,913
  • 4
  • 30
  • 34
  • Thanks for mentioning `__slots__`, a nice mechanism. But still assigning to the class directly is sufficient in this case and even more efficient. – lRem Jul 13 '11 at 14:39
2

You can create new subtypes with the `type` function. You just have to give it the subclass's namespace as a dict.

# these are supposed to overwrite methods
def foo(self):
    return "foo"

def bar(self):
    return "bar"


def variants(base, methods):
    """
    Given a base class and an iterable of dicts like [{'foo': <function foo>}],
    yields types T(base) with foo overridden.
    """
    for d in methods:
        yield type('NodeVariant', (base,), d)


from itertools import combinations
def subdicts(**fulldict):
    """ returns all dicts that are subsets of `fulldict` """
    items = fulldict.items()
    for i in range(len(items)+1):
        for subset in combinations(items, i):
            yield dict(subset)

# all combinations of overridden methods (a generator of dicts)
combos = subdicts(slow_method=foo, dumb_method=bar)

# base class
class Node(object):
    def dumb_method(self):
        return "dumb"
    def slow_method(self):
        return "slow"

# use the base and our variants to make a number of types
types = variants(Node, combos)

# instantiate each type and call both methods on it for demonstration
print [(var.dumb_method(), var.slow_method()) for var
          in (cls() for cls in types)]

# [('dumb', 'slow'), ('dumb', 'foo'), ('bar', 'slow'), ('bar', 'foo')]
Jochen Ritzel
  • 104,512
  • 31
  • 200
  • 194
0

I'm not sure if you're trying to do something akin to this (it allows swapping methods out at runtime, a sort of runtime "inheritance"):

class Node(object):
    __methnames = ('method',)

    def __init__(self, variant):
        # e.g. Node('dumb') binds self.method to self.dumb_method
        for name in self.__methnames:
            setattr(self, name, getattr(self, variant + "_" + name))

    def dumb_method(self, argument):
        # ...
        pass

    def slow_method(self, argument):
        # ...
        pass

n = Node('dumb')
n.method(42)  # calls dumb_method

n = Node('slow')
n.method(42)  # calls slow_method

Or if you're looking for something like this (it allows running, and therefore testing, all methods of the class):

class Node(object):
    # do something
    pass

class NodeTest(Node):

    def run_tests(self, ending=''):
        for name in dir(self):
            # skip private/dunder attributes, only call matching methods
            if name.endswith(ending) and not name.startswith('_'):
                meth = getattr(self, name)
                if callable(meth):
                    meth()  # needs some default args
                    # or yield meth if you can
cwallenpoole
  • 79,954
  • 26
  • 128
  • 166
  • More like the first one. But all the *variants* are independent, so this would need one `__init__` argument for each of the possible variations. One option that I was considering: keep `Node` simple, with a set of default variants. Upon configuration, patch it with `Node.dumb_method = smart_method`. But then, where should `smart_method` live? Be a function with `self` as argument? Be in its own `class` that will never be instantiated? – lRem Jul 13 '11 at 14:25
  • @lRem: give `Node` all the methods (+ one just called `method` or something) and do `Node.method = Node.smart_method`? – Steven Jul 13 '11 at 15:30
0

You can use a metaclass for this.

It will let you create a class on the fly, with methods chosen according to each variation.
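Something along these lines, for example (just a rough sketch; `VariantMeta` and its `variants` dict are made-up names, and the dict would come from your experiment configuration):

class VariantMeta(type):
    # filled in by the experiment configuration before Node is defined
    variants = {}

    def __new__(mcs, name, bases, namespace):
        # overwrite the default methods with the configured variants
        namespace.update(mcs.variants)
        return super(VariantMeta, mcs).__new__(mcs, name, bases, namespace)

def smart_method(self, argument):
    return "smart"

VariantMeta.variants = {'dumb_method': smart_method}

class Node(object):
    __metaclass__ = VariantMeta  # Python 2 syntax, as elsewhere in this question

    def dumb_method(self, argument):
        return "dumb"

    def slow_method(self, argument):
        return "slow"

print Node().dumb_method(None)  # prints "smart"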

Bite code
  • 578,959
  • 113
  • 301
  • 329
  • This is probably the best description of metaclasses I've seen. I wish I could upvote ;) I've made a minor correction to it: please peer review. – lRem Jul 13 '11 at 17:22
  • Thanks. There are actually other typos, but I'm too lazy to fix them now :-) – Bite code Jul 13 '11 at 23:10
0

Should the method to be called be decided when the class is instantiated or after? Assuming it is when the class is instantiated, how about the following:

class Node():
    def Fast(self):
        print "Fast"

    def Slow(self):
        print "Slow"

class NodeFactory():
    def __init__(self, method):
        self.method = method

    def SetMethod(self, method):
        self.method = method

    def New(self):
        n = Node()
        n.Run = getattr(n, self.method)
        return n

nf = NodeFactory("Fast")

nf.New().Run()
# Prints "Fast"

nf.SetMethod("Slow")

nf.New().Run()
# Prints "Slow"
combatdave
  • 765
  • 7
  • 17
  • As I've added to the question: I can even go further, up to patching the whole class, not only the object. The method to be called is determined at configuration, before instantiation of any Node, and does not change as long as the Nodes are needed. – lRem Jul 13 '11 at 15:04
  • Does my answer not do what you need to do, then? Unless you specifically need to patch the whole class, which I suspect you don't. – combatdave Jul 13 '11 at 15:13
  • It does, but now that I think of it, patching the class is more beneficial. First, it doesn't introduce the need for a factory. Second, in this particular case, patching the class ensures the *All nodes always follow the same variation* part. – lRem Jul 13 '11 at 15:41