
We have a number of builds that rely on each other (e.g. Build A has to run before Build B because Build B references .dlls created in Build A), so in the Source Settings of each build we hard-code the Build Agent Folder. It can't be the default $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath) because then subsequent builds wouldn't know where to get their source. But now I've set up CI builds and I'm often getting the error:

Exception Message: Unable to create the workspace '106_33_pgbuildorig' due to 
a mapping conflict. You may need to manually delete an old workspace.

I initially tried setting the CI builds to use a different folder, but it turns out that we need them to be in the same folder as well, because we want to pick up the latest output from a CI build in subsequent other builds.

Any ideas how I can avoid having to manually delete the workspaces created by Team Build so often?

Actually, I don't know how these builds worked before I arrived (I just started working here), since it would seem like hard-coding the Source Settings would cause overlapping workspaces that would fail on their next run anyway.

Ben_G

1 Answer


You should stop having builds that rely on each other like this. Aside from the immediate problem you're having, it's generally bad practice because it produces builds that aren't reproducible: if you recompile older source code, you should get output that exhibits the same behavior. If you're relying on ever-changing binaries generated externally, you can't guarantee that. You can't even guarantee that old code will still compile.

It also makes it difficult (bordering on impossible) to effectively scale your build infrastructure beyond a single build agent.

The better solution depends a bit on your scenario, but roughly speaking:

  • If the reason you're doing this is because you have multiple applications that all rely on a shared set of components, use NuGet packages to share versioned binaries between different applications that need to consume those binaries.
  • If the reason is for build speed (it's a big application, and you don't want to rebuild X+Y when just Y changes), use NuGet packages.
  • If neither of the above, just build everything at once.
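As a rough illustration of the first two bullets, a shared component can be described by a minimal `.nuspec` file so its binaries are published as a versioned package instead of being copied between build folders. All names and paths below are hypothetical placeholders, not taken from the question:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <!-- Hypothetical package id; use your own component's name -->
    <id>MyCompany.SharedComponents</id>
    <!-- Bump this version each time the component is rebuilt and published -->
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Shared binaries consumed by dependent applications.</description>
  </metadata>
  <files>
    <!-- Package the build output into the lib folder for the target framework -->
    <file src="bin\Release\MyCompany.SharedComponents.dll" target="lib\net45" />
  </files>
</package>
```

The package would then be produced with `nuget pack MyCompany.SharedComponents.nuspec` and published to an internal feed with `nuget push`; each dependent application runs `nuget restore` at the start of its build, so it pulls a known version from the feed rather than reading loose .dlls from another build's workspace.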
Daniel Mann
  • Thanks Daniel. I know this isn't a good situation, but like I said, I just started here, so it's going to take some time to clean things up. I can't even figure out what order to build things in to get them to work! There are something like 215 solution files; 25 or so are referenced in the 12 or so builds, but almost every solution includes references to other .csproj files or .dlls generated from other builds. – Ben_G Jul 24 '17 at 22:08
  • Just curious Daniel - how would NuGet packages help in this situation? My only experience with NuGet is 3rd-party add-ons. – Ben_G Jul 24 '17 at 22:21
  • @Ben_G Your components can be built and published separately to a NuGet feed, then referenced by each dependent application and restored during build. In a way, it's just formalizing what you're currently doing -- binaries are created and published so they can be referenced by other applications that need them. The difference is that NuGet packages are versioned and can be restored from a known location as necessary. – Daniel Mann Jul 24 '17 at 22:51
  • thanks for the tip. I'll definitely look into this. Do you know of anywhere that explains how to do what you're suggesting? I've found plenty of docs describing NuGet, but nothing yet that describes building your own .dlls, saving them and then referencing them in other builds. – Ben_G Jul 25 '17 at 16:00
  • @Ben_G Push for an upgrade to TFS 2017 or a migration to VS Team Services; there is an integrated NuGet package feed and a new build system that makes packaging and publishing NuGet packages very easy. – Daniel Mann Jul 25 '17 at 16:43
  • I started to look into using NuGet. I opened one of the solutions and started deleting projects (.csproj files) and replacing them with NuGet packages (that I generated separately). How is this going to help the initial problem of overlapping workspaces? And I know this isn't a thread about NuGet, but I'm hoping you can answer a few questions: 1) Every time someone needs to modify the code included in a NuGet package, they need to open that project, build it locally, and regenerate the NuGet package? That's going to be very hard to sell to the team. – Ben_G Jul 26 '17 at 20:44
  • ...and 2) how would those new NuGet packages be updated in subsequent builds of the parent solution? I see when right-clicking on a NuGet package that it's been copied locally, so do I need to re-import the package to get the updated version? – Ben_G Jul 26 '17 at 20:46
  • This is going to help the problem of overlapping workspaces by eliminating the dependency problem entirely. Instead of Build A relying on the output of Build B being at some arbitrary location in your file system, Build A can pull down a specific version of the **results** of Build B on-demand during its own build process. You can avoid pain during development by using conditional project references, as explained here: https://stackoverflow.com/a/27711522/781754 – Daniel Mann Jul 26 '17 at 20:55
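The conditional project references mentioned in the last comment (and detailed in the linked answer) can be sketched roughly as follows in a consuming .csproj. This is a hypothetical fragment; the project and package names, framework folder, and paths are placeholders:

```xml
<!-- Inside the consuming project's .csproj (MSBuild) -->
<Choose>
  <!-- During local development, when the dependency's source is on disk,
       use a normal project reference so edits flow through immediately -->
  <When Condition="Exists('..\MyCompany.SharedComponents\MyCompany.SharedComponents.csproj')">
    <ItemGroup>
      <ProjectReference Include="..\MyCompany.SharedComponents\MyCompany.SharedComponents.csproj" />
    </ItemGroup>
  </When>
  <!-- On the build server (or when the source isn't checked out),
       fall back to the restored NuGet package's assembly -->
  <Otherwise>
    <ItemGroup>
      <Reference Include="MyCompany.SharedComponents">
        <HintPath>..\packages\MyCompany.SharedComponents.1.0.0\lib\net45\MyCompany.SharedComponents.dll</HintPath>
      </Reference>
    </ItemGroup>
  </Otherwise>
</Choose>
```

With this in place, developers who modify the shared component locally don't need to regenerate the package on every change; only the published package used by other builds needs a version bump when the component is released.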