
I'm in a scenario where I want to refactor several classes that have identical and/or similar methods. There are around 20 classes and around 15 similar methods, and all sorts of combinations exist within this space, which is why I'm a bit reluctant to use inheritance to solve this issue (rightfully?).

The code is part of a wrapper around another application that is controlled through a COM API. The wrapper in turn is part of a package that is distributed internally at the company where I work, so the interfaces of the classes have to remain the same (for backwards compatibility).

This example illustrates some very simplified versions of the classes:

class FirstCollectionLike:
    def __init__(self):
        self._collection = list()

    def add(self, arg):
        self._collection.append(arg)

    def remove(self, index):
        del self._collection[index]


class SecondCollectionLike:
    def __init__(self):
        self._collection = list()
        self._resource = some_module.get_resource()

    def start(self):
        some_module.start(self._resource)

    def add(self, arg):
        self._collection.append(arg)

    def remove(self, value):
        self._collection.remove(value)


class SomeOtherClass:
    def __init__(self):
        self._some_attribute = 0
        self._resource = some_module.get_resource()

    def add(self, value):
        self._some_attribute += value

    def start(self):
        some_module.start(self._resource)

Are there any design patterns I could look into that would help me solve this issue?

My initial thought was to create method classes like Add, RemoveByIndex and RemoveByName that implement __call__ like so:

class Add:
    def __init__(self, owner):
        self.owner = owner

    def __call__(self, item):
        self.owner._collection.append(item)

class AddAndInstantiate:
    def __init__(self, owner, type_to_instantiate):
        self.owner = owner
        self.type_to_instantiate = type_to_instantiate

    def __call__(self, name):
        self.owner._collection.append(self.type_to_instantiate(name))

and then assign instances of those classes as instance attributes to their respective owner objects:

class RefactoredClassOne:
    def __init__(self):
        self.add = Add(self)
        self.remove = RemoveByIndex(self)

class RefactoredClassTwo:
    def __init__(self):
        self.add = AddAndInstantiate(self, SomeClass)
        self.remove = RemoveByName(self)

This way I could quite easily add any method I want to a class and provide some arguments to the method class if needed (like the type to instantiate in the example above). The downside is that it is a bit harder to follow what is happening, and the automatic documentation generation we use (Sphinx) does not document methods implemented this way.

Does this seem like a bad approach? What are the alternatives?

Izen
  • Looking into [composition](https://en.wikipedia.org/wiki/Composition_over_inheritance) might help you – Er... Dec 04 '19 at 15:26
  • Would you consider the solution I proposed a form of composition? Is there anything you would change in that regard? – Izen Dec 05 '19 at 07:09

1 Answer


First, if your classes are as simple as your example suggests, I'm not sure OOP is the right tool. What your classes are doing is just renaming a couple of basic calls. This is useless abstraction and IMO a bad practice (why force me to look into the SecondCollectionLike.py file to discover 1) that .add() is in fact a wrongly named append and 2) that my collection is in fact a list with a fancy name?).

In that case I'd say that a functional approach might be better, and a workflow such as:

a = SecondCollectionLike()
a.add("x")
a.add("y")
a.remove("x")
a.start()

would be a lot clearer if it looked like

a = list()
resource = some_module.get_resource()
a.append("x")
a.append("y")
a.remove("x")
some_module.start(resource)

If your classes are in fact more complex and you really want to keep the OOP approach, I think that this solution is probably close to your solution and what you're looking for.

The idea is to have modules which hold the logic. For example, a _collection_behaviour.py module which holds the add(), remove(), increment() (or whatever) logic, and a _runtime.py module which holds the start(), stop(), etc. logic.

This way you could have classes which exhibit behaviour from these modules:

class MyClass:
    def __init__(self):
        self._collection = list()

    from ._collection_behaviour import add
    from ._collection_behaviour import remove
    from ._runtime import start
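
For reference, the imported functions would just be plain module-level functions that take self as their first argument; a minimal sketch of what _collection_behaviour.py and _runtime.py could contain (assuming they operate on the owner's _collection and _resource attributes):

# _collection_behaviour.py -- expects objects with a _collection attribute
def add(self, arg):
    self._collection.append(arg)

def remove(self, index):
    del self._collection[index]


# _runtime.py -- expects objects with a _resource attribute
import some_module

def start(self):
    some_module.start(self._resource)

Because these are ordinary functions, importing them into the class body binds them as regular methods of MyClass.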

But I do not see the point in turning these functions into classes which implement __call__ if that's all they do.

Er...
  • Yes, the real classes are a bit more complex than this. I will update my question to clarify some things. But to answer your post, the reason for using the function classes is to store some state that they require. Two examples are: which object they are connected to (which is required in order to add to self._collection in my example) and a type that should be instantiated when add is called. I think your solution is good, but it would not work if I need to specify the type as well. Do you have any idea how I could solve that? And again, I will try to improve my problem description. – Izen Dec 05 '19 at 13:10
  • A big plus is that Sphinx will add the documentation of the imported methods to the owner class! – Izen Dec 06 '19 at 07:41