5

TDD is all the rage these days, and an ever-growing number of software shops are converting to agile, scrum, etc. I can certainly see the advantages of automated testing, but I also see TDD as contradicting some principles of good object-oriented design.

TDD requires you to insert seams in your code which expose implementation details through the interface. Dependency injection or collaborator injection violates the principle of information hiding. If your class uses collaborator classes then the construction of these collaborators should be internal to the class and not exposed through the constructor or interface.
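
To illustrate what I mean (a minimal, hypothetical example, not taken from any real codebase), the collaborator is forced out through the constructor purely so that a test can substitute it:

interface Logger {
    void log(String message);
}

class OrderProcessor {
    private final Logger logger;

    // Without the testability concern, OrderProcessor could simply construct
    // its own logger internally and keep that detail hidden.
    OrderProcessor(Logger logger) {
        this.logger = logger;
    }

    void process(String orderId) {
        logger.log("processing " + orderId);
    }
}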

I haven't seen any literature addressing the conflict between writing testable code and, at the same time, adhering to the principles of encapsulation, simplicity, and information hiding. Have these problems been addressed in any standard way?

Assaf Stone
  • 6,309
  • 1
  • 34
  • 43
Julian
  • 59
  • 2
  • 2
    Why have you linked this question with Java? To me this question sounds completely language agnostic. – quamrana May 07 '11 at 09:27
  • I tagged it Java because I mentioned some more or less Java specific things like dependency injection. Some dynamic languages allow for ways to test code without inserting seams into it and do not have the problems I brought up. – Julian May 08 '11 at 04:07
  • 2
    These constructs (e.g. dependency injection) are common to **all** statically typed languages - Java, C#, etc. – Assaf Stone May 08 '11 at 04:20

6 Answers

10

The methods and classes you think are implementation details are really seams that represent axes along which you can vary and recompose components into new constellations.

The classic idea about encapsulation tends to be too coarse-grained because when you hide a lot of moving parts you also make the code very inflexible. It also tends to violate a lot of the SOLID principles - most prominently the Single Responsibility Principle.

Most object-oriented design patterns tend to be rather fine-grained and fit well into the SOLID principles, which is another indicator that proper object-oriented design is fine-grained.

If your class uses collaborator classes then the construction of these collaborators should be internal to the class and not exposed through the constructor or interface.

These are two different things mixed together. I agree that in most cases the collaborators should not be exposed through the interfaces. However, exposing them through the constructors is the correct thing to do. Constructors are essentially implementation details of a class, while the interfaces provide the real API.

If you want to preserve a coarse-grained API to target default functionality you can still do so by supplying a Facade. See this post for more details: Dependency Inject (DI) "friendly" library
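
A rough sketch of that idea (the class names here are made up for illustration, not taken from the linked post): the fine-grained class takes its collaborator through the constructor, while a small Facade supplies the default wiring for callers that don't care about the moving parts.

interface MessageFormatter {
    String format(String message);
}

class UppercaseFormatter implements MessageFormatter {
    public String format(String message) {
        return message.toUpperCase();
    }
}

class MessagePrinter {
    private final MessageFormatter formatter;

    // The seam: the collaborator is injected rather than constructed internally.
    MessagePrinter(MessageFormatter formatter) {
        this.formatter = formatter;
    }

    void print(String message) {
        System.out.println(formatter.format(message));
    }
}

// The Facade hides the default composition, preserving a coarse-grained API.
class DefaultMessagePrinter extends MessagePrinter {
    DefaultMessagePrinter() {
        super(new UppercaseFormatter());
    }
}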

Community
  • 1
  • 1
Mark Seemann
  • 225,310
  • 48
  • 427
  • 736
  • Mark, thanks for the excellent response. While I don't necessarily disagree, I'm ambivalent about some of the things you've mentioned. It's true that a constructor is part of the implementation. I still think that a class should expose as few implementation details as possible through the constructor (following the KISS principle). When I expose an implementation detail through the constructor, I delegate the responsibility of the class to whatever is creating the class instead of to the class itself. – Julian May 09 '11 at 13:22
  • For example, suppose my class needs to create a new File. If I want the class to be unit testable, I would need to create a FileFactory interface, implement it, and pass it to the constructor. Not only am I exposing an unnecessary implementation detail, I'm also documenting through code (incorrectly) that the implementation that creates the files is intended to be mutable. I'm creating a moving part that exists for no reason other than testability. Moving parts create complexity, and in my experience complexity and over-engineering create more problems than anything else. – Julian May 09 '11 at 13:28
  • If done correctly you don't open your classes for test-only code - you'd be following the Open/Closed Principle: http://blog.ploeh.dk/CommentView,guid,3cdd18ec-a1a3-4572-9a5d-4b914b0fbd7c.aspx Actually I love the example of reading and writing files, because TDD will drive us towards a more open design. Instead of coupling against the file system, we could open the API to work with any stream. Now the SUT would also work against database or network streams - much better :) – Mark Seemann May 09 '11 at 13:36
  • Yes, I understand that coupling against an interface is more general. My concern is that it's too general for non-library code. In my example we need to create and write to a file. This means that we need to create a StreamFactory/FileFactory interface and implement it in the way this class needs. We've introduced a factory and an interface, and de-encapsulated some implementation of this class. This seems like entirely too much indirection and over-engineering for a simple task. If it were library code, I'd say it's just the right amount of indirection. For application code, it's too much. – Julian May 09 '11 at 15:08
  • 1
    Well, having done TDD since 2003 I don't make that distinction. It's not over-engineering, it's just a style of design. – Mark Seemann May 09 '11 at 16:13
2

Perhaps there is little literature because it is a false dichotomy?

TDD requires you to insert seams in your code which expose implementation details through the interface.

No, the constructor or method used to inject the dependency need not be part of the interface the calling class uses:

class Zoo {
    Animal exhibit;
}

interface Animal {
    void walk();
}

class Dog implements Animal {
    DogFood food;

    Dog(DogFood food) {
        this.food = food;
    }

    @Override
    public void walk() {
        // walks happily, having already been fed
    }
}

If your class uses collaborator classes then the construction of these collaborators should be internal to the class and not exposed through the constructor or interface.

In the above example, the Zoo cannot access the DogFood, as it gets the Dog after it has already been fed, and the Dog doesn't expose its food.
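
A hypothetical usage sketch of the above (assuming a trivial DogFood class), showing that whoever composes the objects hands the Dog its food, while the Zoo only ever sees an Animal:

class DogFood {
    // details irrelevant to the Zoo
}

class ZooTest {
    public static void main(String[] args) {
        Dog dog = new Dog(new DogFood());  // could just as well be a test double
        Zoo zoo = new Zoo();
        zoo.exhibit = dog;                 // the Zoo never touches the DogFood
        zoo.exhibit.walk();
    }
}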

meriton
  • 68,356
  • 14
  • 108
  • 175
2

Mark Seemann's answer is excellent. "Units" in industry often break the Single Responsibility Principle and make the system difficult to maintain due to all the internally hard-wired dependencies. TDD exposes flaws like this nicely. Units can then be built up like Lego blocks to form larger functional units that actually perform useful business logic. Really, I see TDD as supporting building good OO systems very well.

Don't worry about hiding things so much. Think of private as meaning "you are not allowed to access this" not "the system does not currently need to access this". Use it carefully - usually for things that screw up the state of an object if accessed from outside at the wrong time.

Alan Escreet
  • 3,499
  • 2
  • 22
  • 19
1

There are a few things that can be done to integrate TDD with OOP, depending on the language in question. In Java, you can use reflection to test private functionality, and the test can be placed in the same package (preferably in a separate source tree) to test package-private functionality. Personally, I prefer testing functionality only via the public API of the code in question.
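
For example, a minimal sketch (class and method names are hypothetical) of exercising a private method via reflection:

import java.lang.reflect.Method;

class PriceCalculator {
    private int applyDiscount(int price) {
        return price * 90 / 100;
    }
}

class PriceCalculatorReflectionTest {
    public static void main(String[] args) throws Exception {
        PriceCalculator calculator = new PriceCalculator();

        // Look up the private method and make it callable from the test.
        Method method = PriceCalculator.class.getDeclaredMethod("applyDiscount", int.class);
        method.setAccessible(true);

        int result = (Integer) method.invoke(calculator, 200);
        System.out.println("applyDiscount(200) = " + result);  // prints 180
    }
}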

I don't know of any "official" resources on this, though I know that Uncle Bob has written extensively about TDD and considers it compatible with his "SOLID" principles of OOP.

Nathan Ryan
  • 12,893
  • 4
  • 26
  • 37
1

In my experience, TDD is highly supportive of the principles of object-oriented design.

Dependency Injection isn't an artifact of TDD; it's commonly used in the design of OO frameworks regardless of the development methodology.

Dependency Injection is about loose coupling: if class A uses one or more objects of class B, a good design will minimise the knowledge class A has of the internals of class B. I believe this is what you are referring to when you mention 'information hiding'.

Consider what happens if class B changes its implementation. Or, in a more complex but still common case, what if you want to dynamically substitute different sub-classes of B depending on the situation (you might be using the Strategy pattern, for example), but the class that makes this decision is not class A?

For this reason, Java has interfaces. Instead of making class A dependent on class B, you make it dependent on an interface that class B implements. Then you can substitute any class that implements that interface, without changing the code inside A.

This includes, but is in no way limited to, substituting fake objects for testing purposes.
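
A brief sketch of this point (names invented for illustration): class A depends only on the interface, so the production implementation and a hand-written test fake are interchangeable.

interface MessageSender {
    void send(String text);
}

class SmtpSender implements MessageSender {
    public void send(String text) {
        // talk to a real mail server here
    }
}

class FakeSender implements MessageSender {
    String lastSent;

    public void send(String text) {
        lastSent = text;  // just record the message for the test to inspect
    }
}

class Notifier {  // plays the role of "class A"
    private final MessageSender sender;

    Notifier(MessageSender sender) {  // depends on the interface, not on SmtpSender
        this.sender = sender;
    }

    void greet(String user) {
        sender.send("Hello, " + user);
    }
}

// Production: new Notifier(new SmtpSender())
// Test:       new Notifier(new FakeSender()), then assert on lastSent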

TDD makes use of Dependency Injection, absolutely, just like TDD makes use of many principles of OO. But DI is a principle of OO, not a principle of TDD.

KarlM
  • 1,614
  • 18
  • 28
0

When using dependency injection, object construction and object-graph management are handled by the factory. Objects are created via factories, so the code that requests an object is not aware of the dependencies of the created class.

For example, if class A has dependencies B and C that are passed via the constructor, B and C will be provided by the factory method. The code that creates A would not be aware of the objects B and C.
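
A small sketch of that arrangement (plain Java, no framework, all names hypothetical):

class B { }

class C { }

class A {
    private final B b;
    private final C c;

    A(B b, C c) {  // dependencies arrive via the constructor
        this.b = b;
        this.c = c;
    }
}

class AFactory {
    // Only the factory knows how A's object graph is assembled.
    static A create() {
        return new A(new B(), new C());
    }
}

// Caller: A a = AFactory.create();  // never sees B or C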

Dependency injection frameworks like Google Guice and Ninject can be used to automate the factory creation.

Sandeep
  • 648
  • 5
  • 12