Take this basic build pipeline (with its Gradle tasks):
- Compile and run unit tests (gradle clean build)
- Integration tests (gradle integrationTest)
- Acceptance tests (gradle acceptanceTest)
- Deploy (gradle myCustomDeployTask)
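For context, the `integrationTest`, `acceptanceTest`, and `myCustomDeployTask` tasks above are custom. A minimal sketch of how they might be wired up in `build.gradle` — the source-set layout and the deploy target are assumptions for illustration, not part of the actual build:

```groovy
apply plugin: 'war'

// Hypothetical source sets for the two test stages.
sourceSets {
    integrationTest {
        compileClasspath += sourceSets.main.output + sourceSets.test.output
        runtimeClasspath += sourceSets.main.output + sourceSets.test.output
    }
    acceptanceTest {
        compileClasspath += sourceSets.main.output + sourceSets.test.output
        runtimeClasspath += sourceSets.main.output + sourceSets.test.output
    }
}

task integrationTest(type: Test) {
    testClassesDirs = sourceSets.integrationTest.output.classesDirs
    classpath = sourceSets.integrationTest.runtimeClasspath
}

task acceptanceTest(type: Test) {
    testClassesDirs = sourceSets.acceptanceTest.output.classesDirs
    classpath = sourceSets.acceptanceTest.runtimeClasspath
}

task myCustomDeployTask(dependsOn: war) {
    doLast {
        // Hypothetical deploy step: copy the WAR built in stage 1.
        copy {
            from war.archivePath
            into '/opt/deploy'   // assumed target directory
        }
    }
}
```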
According to Jez Humble's "Continuous Delivery" book, you should only build your binaries once. So in the theoretical pipeline above, step 1 cleans, compiles, and builds the WAR; step 2 runs the integration tests against the compiled code from step 1; step 3 runs the acceptance tests against that same code; and step 4 deploys the WAR that was built in step 1. So far so good.
I'm trying to implement this pipeline in Jenkins. Because each Jenkins job has its own workspace, steps 2, 3, and 4 end up recompiling the code and rebuilding the WAR, which violates the "Continuous Delivery" mantra of building your binaries only once.
To combat this, I used the "Clone Workspace SCM" Jenkins plugin, which zips up the workspace from the first job and uses that archive as the workspace for jobs 2, 3, and 4. However, Gradle still recompiles the code in each step, because it apparently uses the absolute paths of the files to decide whether a task needs to be executed. Since the plugin moved the files into a new workspace, the absolute paths changed, which makes Gradle think it has to start from scratch rather than performing an incremental build.
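One workaround I've considered — a sketch only, and the property name is made up — is to tell Gradle in the downstream jobs to trust the class files restored from the zipped workspace by disabling the compile tasks outright when a flag is passed (e.g. `gradle integrationTest -PassumeCompiled`):

```groovy
// Hypothetical guard in build.gradle: when a downstream Jenkins job passes
// -PassumeCompiled, disable every compile* task so Gradle reuses the class
// files restored into the new workspace instead of recompiling them because
// their absolute paths changed.
if (project.hasProperty('assumeCompiled')) {
    tasks.matching { it.name.startsWith('compile') }.all {
        it.enabled = false
    }
}
```

This sidesteps Gradle's up-to-date check rather than fixing it, so it feels fragile — which is part of why I'm asking what the sanctioned approach is.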
Now we could share workspaces between Jenkins jobs instead, but that's frowned upon as well, because two jobs could end up running against the shared workspace at the same time.
So how does one implement the above pipeline using Jenkins and Gradle while adhering to the best practices of Continuous Delivery, Jenkins, and Gradle?