36

For a customer we need to generate detailed test reports for integration tests which not only show that everything is green, but also what the tests actually did. My colleagues and I are lazy guys and we do not want to hack together spreadsheets or text documents.

For that, I am thinking about a way to document the more complex integration tests with JavaDoc comments on each @Test annotated method and on each test class. For the testers it is a great help to see which requirement, Jira ticket or whatever a test is linked to and what the test actually tries to do. We want to provide this information to our customer, too.

The big question now is: How can we put the JavaDoc for each method and each test class into the JUnit reports? We use JUnit 4.9 and Maven.

I know that there is a description for each assertXXX(), but what we really need is a nice HTML list or a PDF document that lists all classes and their documentation and, below each class, all @Test methods with their description, the testing time, the result and, if the test failed, the reason why.

Or is there another alternative to generate fancy test reports? (Or should we start an open source project for this!? ;-) )

Update: I asked another question about how to add a RunListener in Eclipse so that it also reports there when the tests are started from Eclipse. The proposed solution with a custom TestRunner is another way to get the test results reported. Have a look: How can I use a JUnit RunListener in Eclipse?

Rick-Rainer Ludwig
  • Maven Surefire Plugin generates XML; have you considered writing an XSLT to generate a report in the format that you require? – Prashant Bhate Nov 18 '11 at 04:16
  • I finally found the time to document my own solution which is based on `RunListener` and custom annotations at http://frightanic.com/software-development/adding-references-junit-tests/ – Marcel Stör Oct 22 '14 at 22:05

6 Answers

22

One way to achieve this would be to use a custom RunListener, with the caveat that it is easier to use an annotation rather than Javadoc. You would need a custom annotation, used like this:

@TestDoc(text="tests for XXX-342, fixes customer issue blahblah")
@Test
public void testForReallyBigThings() {
    // stuff
}
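
For the listener to be able to read this annotation at run time, the annotation type needs runtime retention. A minimal sketch of such a declaration (the name TestDoc and its text attribute simply mirror the usage above; adjust to your needs):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Runtime retention is required so that Description.getAnnotation() can see it.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface TestDoc {
    String text();
}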

RunListener listens to test events, such as test start, test end, test failure, test success etc.

import org.junit.runner.Description;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

// Callbacks offered by org.junit.runner.notification.RunListener;
// a custom listener extends it and overrides the callbacks it needs.
public class RunListener {
    public void testRunStarted(Description description) throws Exception {}
    public void testRunFinished(Result result) throws Exception {}
    public void testStarted(Description description) throws Exception {}
    public void testFinished(Description description) throws Exception {}
    public void testFailure(Failure failure) throws Exception {}
    public void testAssumptionFailure(Failure failure) {}
    public void testIgnored(Description description) throws Exception {}
}

Description contains the annotations applied to the test method, so using the example above you can get the TestDoc annotation using:

description.getAnnotation(TestDoc.class);

and extract the text as normal.

You can then use the RunListener to generate the files you want, with the text specific to each test, whether the test passed, failed or was ignored, the time taken, etc. This would be your custom report.
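
As a rough sketch of such a listener (the class name MyResultListener and the output file are only placeholders, and it writes plain text where you might prefer XML or HTML):

import java.io.FileWriter;
import java.io.IOException;

import org.junit.runner.Description;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

public class MyResultListener extends RunListener {

    private final StringBuilder report = new StringBuilder();

    @Override
    public void testStarted(Description description) {
        // Pull the documentation text from the custom annotation, if present.
        TestDoc doc = description.getAnnotation(TestDoc.class);
        report.append(description.getDisplayName())
              .append(": ")
              .append(doc != null ? doc.text() : "(undocumented)")
              .append('\n');
    }

    @Override
    public void testFailure(Failure failure) {
        report.append("    FAILED: ").append(failure.getMessage()).append('\n');
    }

    @Override
    public void testRunFinished(Result result) throws IOException {
        // Dump the collected report once the whole run is over.
        try (FileWriter out = new FileWriter("target/test-doc-report.txt")) {
            out.write(report.toString());
        }
    }
}

Registered through the Surefire configuration shown below, this produces one plain-text file per run; swapping the StringBuilder output for an XML or HTML writer is straightforward.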

Then, in surefire, you can specify a custom listener, using:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.10</version>
  <configuration>
    <properties>
      <property>
        <name>listener</name>
        <value>com.mycompany.MyResultListener,com.mycompany.MyResultListener2</value>
      </property>
    </properties>
  </configuration>
</plugin>

This is from the Maven Surefire Plugin documentation: Using JUnit, Using Custom Listeners and Reporters.

This solution has the disadvantage that you don't have the flexibility of Javadoc as far as line breaks and formatting are concerned, but it does have the advantage that the documentation is in one specific place: the TestDoc annotation.

Matthew Farwell
  • This idea is actually quite good. For writing something new, the amount of work is quite small. I checked the idea and tried some things out. It is better than nothing. ;-) One could create some text documents first and later extend it to write HTML or something. – Rick-Rainer Ludwig Nov 19 '11 at 16:45
  • You can write text documents, or xml or html, anything. In your case it may be best to write simple XML and then transform it to something prettier as a later part of the process. That way the RunListener stays simple. – Matthew Farwell Nov 19 '11 at 18:24
  • This is the way we are going to go at first. The implementation effort is not that big and it is simple and clear for others to understand. – Rick-Rainer Ludwig Nov 22 '11 at 19:48
  • Glad you found what you needed in the end...for now :P – Patrick Magee Nov 22 '11 at 22:31
  • @MatthewFarwell, thanks for the valuable information. I wish there was a way a custom `RunListener` could contribute to the XML created by the surefire plugin :( Because it cannot, a listener will have to put its output "somewhere" and a post-processing task will have to merge it with the XML. – Marcel Stör Apr 16 '14 at 13:04
  • There is another solution with a custom JUnit TestRunner. Have a look here: http://stackoverflow.com/questions/10537495/how-can-i-use-a-junit-runlistener-in-eclipse – Rick-Rainer Ludwig Jan 26 '15 at 18:58
5

Have you looked at Maven Surefire reports?

You can generate an HTML report from your JUnit tests.

http://maven.apache.org/plugins/maven-surefire-report-plugin/

I'm not sure how customizable it is though. But it's a good starting point.

I also know that TestNG (an alternative to JUnit) has some report-generating capabilities: http://testng.org/doc/documentation-main.html#logging-junitreports

I would also recommend log4j: http://logging.apache.org/log4j/1.2/manual.html

Patrick Magee
  • I do not know any possibility to put any other documentation into the surefire report plugin. Do you know one? The only way is to put code cross-references in, but that is not what I need, because this would only work for failed tests, as far as I know. TestNG is not an option; we are bound to JUnit here. We have a lot of legacy code. Log4j is in use, but it's nothing we could put into reports nicely, I am afraid. But thanks for your answer anyway. – Rick-Rainer Ludwig Nov 10 '11 at 20:23
  • I do not, sorry. I can definitely say there is a lack of a good open-source report generator for use with unit testing. Maybe you are after something more BDD-like, such as JBehave: http://jbehave.org/ where you can write full user stories and have them integrated into testing your code, then generate huge reports based on the user stories. http://blog.codecentric.de/en/2011/03/automated-acceptance-testing-using-jbehave/ Certainly not an answer, but something you could look into. Hope you find what you are looking for. – Patrick Magee Nov 10 '11 at 20:30
  • JBehave looks fine, but for another purpose. It gives me another idea to think about. That's something for the project people for sprint planning and monitoring. (+1 for that ;-)) I work in testing and QA, and I am looking for something to satisfy our customers' wish for nice test reports and to give our internal customers and developers a quick look at failed tests with a hint of what went wrong. I am very close to writing a script myself to enrich the XML output of Surefire. It just depends on the answers coming in on this question within the next three days... ;-) – Rick-Rainer Ludwig Nov 10 '11 at 20:38
2

You can use jt-report, an excellent framework for test reporting.

lols
1

I have created a program using TestNG and iText which outputs the test results in a nice PDF report. You can put a description of your test in the @Test annotation, and that can be included in the PDF report as well. It provides the run times of the individual tests and of the entire suite. It is currently being used to test web apps with Selenium, but that part could be ignored. It also allows you to run multiple test suites in one run, and if tests fail, it allows you to re-run only those tests without having to re-run the entire suite; those results will be appended to the original results PDF. See the link below to the source if you are interested. I wouldn't mind this becoming an open-source project as I have a good start on it, though I'm not sure how to go about doing that. [screenshot of the generated PDF report]

So I figured out how to create a project on SourceForge. Here's the link: sourceforge link

Reid Mac
  • This looks promising, but I cannot have a closer look now; I will check the details tomorrow. But I am not an expert in TestNG: is it compatible with JUnit? Can it work with Maven, too? That would be the optimum. ;-) I forgot to mention Maven in my question before. :-( I edited the question for that. – Rick-Rainer Ludwig Nov 16 '11 at 21:27
  • That's fine, TestNG is compatible with Maven: [TestNG and Maven](http://testng.org/doc/maven.html). I am currently running my tests using Eclipse. It is compatible with JUnit; the plugin for Eclipse actually provides an option to convert JUnit tests to TestNG tests. They essentially have the same format, except they use different libraries for the annotations, but the actual annotations are the same except for the Before/After ones, and there is an example in the project on SourceForge which you can take a look at. Hope this helps. – Reid Mac Nov 17 '11 at 13:11
  • TestNG is, like JUnit, a unit testing framework, similar to JUnit 4 but more customizable, and it can also be used together with JUnit for reports. – Patrick Magee Nov 20 '11 at 17:51
0

As mentioned above, Maven is definitely the way to go. It makes life really easy. You can create a Maven project pretty easily using the m2eclipse plugin. Once that is done, just run these commands:

cd <project_dir_where_you_have_pom_file>
mvn site:site

This command will create the style sheets for you. In the same directory run:

mvn surefire-report:report

This will run the test cases and convert the output to HTML. You can find the output in 'target/site/surefire-report.html'.

Below is a snippet of the report. As you can see, all the test cases (written in JUnit) are shown in the HTML. Other meta information, like the total number of test cases run, how many were successful, the time taken etc., is also there.

Since I cannot upload an image, I can't show you the output here.

You can go a step further and specify the exact versions of the plugins to use, like:

mvn org.apache.maven.plugins:maven-site-plugin:3.0:site org.apache.maven.plugins:maven-surefire-report-plugin:2.10:report
havexz
  • I already know Maven and I use it, too. But the Surefire reports look very simple and do not contain enough information. I would really like to see some human-readable and interpretable descriptions there for people who are not directly involved in software development. They should also understand what a test tries to tackle... – Rick-Rainer Ludwig Nov 20 '11 at 10:45
  • I looked into the Maven plugins. surefire-report just uses the TEST-*.xml files to generate the report, and those XML files contain no extra info. So to achieve what you want, you have to first customize the surefire plugin to add some extra fields (the fields can come from an annotation) to the XMLs and then have your surefire-report accept those fields. Or... the solution above (by Matthew Farwell) is also good enough. – havexz Nov 20 '11 at 19:34
  • Snippet manually edited to add a description field; once this is done we have to modify the surefire-report. – havexz Nov 20 '11 at 20:11
  • I do not like the idea of patching the surefire report plugin. With every new version of the plugin we would also need to reapply our changes. Maybe, someday, the plugin maintainers change something more essential (e.g. the architecture) and we would need to make some larger adaptations as well... That's not really a way to go for us. At the moment the idea by Matthew Farwell is my favorite. – Rick-Rainer Ludwig Nov 21 '11 at 16:01
  • I agree... I also don't like patching. I was just wondering if they provide some hooks without opening their source code, like the one mentioned above for listeners; that can help. BTW, one thing to add: if you use the listener approach you might lose the aggregation of test results that the default reporter does. – havexz Nov 21 '11 at 17:48
0

Maybe it is worth taking a look at "executable specification" / BDD tools like FIT/FitNesse, Concordion, Cucumber, JBehave etc.

With this practice you will not only be able to satisfy the customer's requirement formally, but you will also be able to bring transparency to a new level.

Shortly speaking, all these tools allow you (or, better, the customer) to define scenarios using natural language or tables, define bindings from the natural-language constructs to real code, run these scenarios and see whether they succeed or fail. Effectively you will have a "live" spec which shows what is already working as expected and what is not.
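
For instance, a JBehave-style binding could look roughly like this (a sketch only; the scenario wording and the steps class are made up, while the @Given/@When/@Then annotations come from org.jbehave.core.annotations):

import org.jbehave.core.annotations.Given;
import org.jbehave.core.annotations.Then;
import org.jbehave.core.annotations.When;

import static org.junit.Assert.assertEquals;

// Binds plain-text scenario steps such as
//   Given a cart with 2 items
//   When the customer checks out
//   Then the order contains 2 items
// to executable Java methods.
public class CheckoutSteps {

    private int itemsInCart;
    private int itemsInOrder;

    @Given("a cart with $count items")
    public void givenACartWithItems(int count) {
        itemsInCart = count;
    }

    @When("the customer checks out")
    public void whenTheCustomerChecksOut() {
        // In a real project this would drive the application under test.
        itemsInOrder = itemsInCart;
    }

    @Then("the order contains $count items")
    public void thenTheOrderContains(int count) {
        assertEquals(count, itemsInOrder);
    }
}

The report generated from such runs then reads like the customer's own scenarios, with pass/fail status attached to each step.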

See a good discussion on these tools: What are the differences between BDD frameworks for Java?

Alexey Tigarev