I manage a small build farm comprising a Jenkins 1.651.3 master running on Windows Server 2008, and a dozen slaves (nodes) running on Windows 7 PCs.
I created a Jenkins multi-configuration job to refresh a build-scripts folder on all 12 slave machines every night. The job just checks out files from an SVN repository to C:\Dev\Build; it doesn't compile or do anything else with the files.
I selected "Executes a Windows batch command" as the Build step to run a simple command e.g. "svn checkout https://mysvn/build C:\Dev\Build".
I then selected the twelve slaves as "individual nodes" in the Slaves axis of the Configuration Matrix.
The slaves are all configured with a single executor, so each node runs only one Jenkins build at a time (this is how they were configured when I inherited this setup).
So here's the problem: if the configuration for "Node X" (call it Job A) stalls because Node X's executor is busy with a build from a different Jenkins project, the sibling configurations (Jobs B, C, D, etc.) appear to wait until the stalled Job A can run, even when the other nodes Y, Z, etc. are idle. I can see a gap of several hours between Job A running on Node X and Job B running on Node Y. This doesn't make sense to me; there is no dependency between Job A and Job B.
Am I interpreting this behaviour incorrectly? Shouldn't each job run independently of its "siblings"?
Is there a better way to have Jenkins run one simple svn checkout on multiple nodes at the same time, without each node's run having to wait for the previous one to finish?
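For what it's worth, one alternative I've been reading about (but haven't tried) is a scripted Pipeline job that fans the checkout out to all the nodes with the parallel step. A minimal sketch, assuming the Pipeline plugin is installed on this master and with placeholder node names standing in for my twelve slaves:

    // Build one parallel branch per slave; the names are placeholders.
    def branches = [:]
    for (name in ['slave-01', 'slave-02', 'slave-03' /* ... all twelve */]) {
        def nodeName = name  // capture the loop variable for the closure
        branches[nodeName] = {
            node(nodeName) {
                // Same simple checkout as the matrix job's batch step
                bat 'svn checkout https://mysvn/build C:\\Dev\\Build'
            }
        }
    }
    // Each branch waits only for its own node's executor to be free
    parallel branches

I don't know whether that's the idiomatic fix here, or whether the matrix job can be made to behave this way, hence the question.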