Okay, as mentioned in my comment, I actually had this same issue. I needed to test Unfiltered endpoints, and the approach that worked best was to start up an Unfiltered server with a single endpoint for each spec, run the spec, and then shut down the server. To accomplish that, I first defined a base specification similar to this:
import org.specs2.mutable.Specification

abstract class SpecificationBase extends Specification{
  // Do setup work here
  step{
    println("Doing setup work...")
    success
  }

  // Include the real spec from the derived class
  include(spec)

  // Do shutdown work here
  step{
    println("Doing shutdown work...")
    success
  }

  /**
   * To be implemented in the derived class. Returns the real specification.
   * @return Specification
   */
  def spec: Specification
}
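For the Unfiltered case specifically, the setup and teardown steps are where the server would be started and stopped. Here is a minimal sketch of that idea; ServerSpecificationBase, the port, and the /ping plan are placeholders I made up for illustration, and the Http/Planify calls assume the unfiltered-jetty and unfiltered-filter modules (adjust to whatever Unfiltered version you are on):

import org.specs2.mutable.Specification
import unfiltered.request._
import unfiltered.response._

// Sketch only: starts a single-endpoint Unfiltered server before the included
// spec runs and stops it afterwards. Port and plan are arbitrary placeholders.
abstract class ServerSpecificationBase extends Specification{
  val port = 8081

  // One endpoint, served only while this spec is running
  val server = unfiltered.jetty.Http(port).filter(unfiltered.filter.Planify {
    case GET(Path("/ping")) => ResponseString("pong")
  })

  step{
    server.start()
    success
  }

  include(spec)

  step{
    server.stop()
    server.destroy()
    success
  }

  def spec: Specification
}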
Basically, SpecificationBase assembles the complete specification from a setup step and a teardown step, with the real specification (defined in the concrete spec class) sandwiched in the middle. So a test using this base class would look like this:
class MySpec extends SpecificationBase{ def spec =
  new Specification{
    "A request to do something" should{
      "be successful in case 1" in {
        println("Testing case 1")
        success
      }
      "be successful in case 2" in {
        println("Testing case 2")
        success
      }
    }
  }
}
When you run this, you will see:
Doing setup work...
Testing case 1
Testing case 2
Doing shutdown work...
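Putting the pieces together for the Unfiltered scenario, a spec derived from the ServerSpecificationBase sketch above could then exercise the running endpoint directly from its examples; again this is only a sketch, and the URL and expected body simply match the placeholder /ping plan:

import org.specs2.mutable.Specification
import scala.io.Source

// Sketch: the example hits the endpoint started by ServerSpecificationBase
class PingSpec extends ServerSpecificationBase{ def spec =
  new Specification{
    "A request to /ping" should{
      "answer with pong" in {
        val body = Source.fromURL("http://localhost:" + port + "/ping").mkString
        body must_== "pong"
      }
    }
  }
}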
It's not perfect, but it works. Is there another (and possibly cleaner/better) way to do this? Probably, but this is one solution you could look into using.