133

We are getting very slow compile times, which can run upwards of 20 minutes on dual-core 2 GHz, 2 GB RAM machines.

A lot of this is due to the size of our solution, which has grown to 70+ projects, as well as VSS, which is a bottleneck in itself when you have a lot of files. (Swapping out VSS is not an option, unfortunately, so I don't want this to descend into a VSS bash.)

We are looking at merging projects. We are also looking at having multiple solutions to achieve greater separation of concerns and quicker compile times for each element of the application. I can see this becoming DLL hell as we try to keep things in sync.

I am interested to know how other teams have dealt with this scaling issue. What do you do when your code base reaches a critical mass where you waste half the day watching the status bar deliver compile messages?

UPDATE: I neglected to mention this is a C# solution. Thanks for all the C++ suggestions, but it's been a few years since I've had to worry about headers.

EDIT:

Nice suggestions that have helped so far (not saying there aren't other nice suggestions below, just what has helped)

  • New 3GHz laptop - the power of lost utilization works wonders when whinging to management
  • Disable Anti Virus during compile
  • 'Disconnecting' from VSS (actually the network) during compile - I may get us to remove VS-VSS integration altogether and stick to using the VSS UI

Still not rip-snorting through a compile, but every bit helps.

Orion did mention in a comment that generics may have a part to play as well. From my tests there does appear to be a minimal performance hit, but not high enough to be sure - compile times can be inconsistent due to disc activity. Due to time limitations, my tests didn't include as many generics, or as much code, as would appear in the live system, so the effect may accumulate. I wouldn't avoid using generics where they are supposed to be used just for compile-time performance.

WORKAROUND

We are testing the practice of building new areas of the application in new solutions, importing in the latest DLLs as required, then integrating them into the larger solution when we are happy with them.

We may also do the same to existing code by creating temporary solutions that just encapsulate the areas we need to work on, and throwing them away after reintegrating the code. We need to weigh up the time it will take to reintegrate this code against the time we gain by not having Rip Van Winkle-like experiences with rapid recompiling during development.

Michael Currie
  • 13,721
  • 9
  • 42
  • 58
johnc
  • 39,385
  • 37
  • 101
  • 139
  • Wow I thought 20 second compile times were infuriatingly long. – Jared Updike Feb 03 '11 at 02:20
  • Try to avoid multiple solutions if at all possible, as refactoring becomes so much harder. – Ian Ringrose Mar 25 '11 at 12:25
  • You could use VSS outside of visual-studio that way you don’t get the overhead of visual-studio talking to VSS. – Ian Ringrose Mar 25 '11 at 12:36
  • How about the resources? I can imagine they slow down the process. I've seen commercial software with exe files the size of CDs that you start from CD (not setup). They were full of videos, audio and pictures. So the software was just this one file... – Bitterblue Jul 22 '14 at 13:09

34 Answers

74

The Chromium.org team listed several options for accelerating the build (at this point about half-way down the page):

In decreasing order of speedup:

  • Install Microsoft hotfix 935225.
  • Install Microsoft hotfix 947315.
  • Use a true multicore processor (i.e. an Intel Core 2 Duo, not a Pentium 4 HT).
  • Use 3 parallel builds. In Visual Studio 2005, you will find the option in Tools > Options... > Projects and Solutions > Build and Run > maximum number of parallel project builds.
  • Disable your anti-virus software for .ilk, .pdb, .cc, .h files and only check for viruses on modify. Disable scanning the directory where your sources reside. Don't do anything stupid.
  • Store and build the Chromium code on a second hard drive. It won't really speed up the build but at least your computer will stay responsive when you do gclient sync or a build.
  • Defragment your hard drive regularly.
  • Disable virtual memory.
Nate
  • 693
  • 6
  • 8
  • 30
    By disable virtual memory I assume you mean disable swap, disabling virtual memory would require a rewrite of the entire OS ;p – Joseph Garvin Jul 21 '10 at 16:05
  • 9
    This look like an answer aimed at C++ builds not C# builds – Ian Ringrose Mar 25 '11 at 12:26
  • 2
    You're right! Though I should point out that I replied before he specified C#, and some of the fixes still apply. – Nate Mar 25 '11 at 18:39
  • * Store the project on an SSD drive * Disable windows indexing (in a file manager, right click solution folder, Properties->Advanced, untick the "Allow files ... indexed ...") – nos Sep 24 '14 at 07:41
  • +1 If you have enough RAM then keep the project on a RAM disk. It can improve performance dramatically, up to 50-70%. Check http://www.codeproject.com/Articles/197663/Speed-up-Visual-Studio-Builds for more information – Arjun Vachhani Apr 04 '15 at 11:17
59

We have nearly 100 projects in one solution and a dev build time of only seconds :)

For local development builds we created a Visual Studio Addin that changes Project references to DLL references and unloads the unwanted projects (and an option to switch them back of course).

  • Build our entire solution once
  • Unload the projects we are not currently working on and change all project references to DLL references.
  • Before check-in change all references back from DLL to project references.

Our builds now take only seconds when we are working on a few projects at a time. We can also still debug the additional projects as they link to the debug DLLs. The tool typically takes 10-30 seconds to make a large number of changes, but you don't have to do it that often.
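For illustration, a hand-written sketch of what such a swap looks like inside a .csproj file (this is not the add-in's actual output; the project name, GUID and hint path below are made up):

<!-- Before: a project reference, which makes MSBuild consider rebuilding the referenced project -->
<ItemGroup>
  <ProjectReference Include="..\MyCompany.Core\MyCompany.Core.csproj">
    <Project>{11111111-2222-3333-4444-555555555555}</Project>
    <Name>MyCompany.Core</Name>
  </ProjectReference>
</ItemGroup>

<!-- After: a plain DLL reference to the last known-good build output -->
<ItemGroup>
  <Reference Include="MyCompany.Core">
    <HintPath>..\MyCompany.Core\bin\Debug\MyCompany.Core.dll</HintPath>
  </Reference>
</ItemGroup>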

Update May 2015

The deal I made (in comments below), was that I would release the plugin to Open Source if it gets enough interest. 4 years later it has only 44 votes (and Visual Studio now has two subsequent versions), so it is currently a low-priority project.

iCollect.it Ltd
  • 92,391
  • 25
  • 181
  • 202
  • 2
    Also used this technique, with a solution having 180 projects. This helped a lot. You can even use the command line to build the entire solution `devenv.exe /build yoursolution /takealookatthedoc`... so you work with only a few projects, and when required, you recompile the whole solution in a cmd line (after a get latest version, for example) – Steve B Sep 17 '11 at 10:06
  • Do you have any links that describe how this is done? I don't mean writing a VS plugin. Rather, the specific tasks described – Daniel Dyson Feb 11 '12 at 15:54
  • @Daniel Dyson: How detailed do you need to know? It all comes down to 1) loading any unloaded projects 2) iterating the solution/project/reference hierarchy 3) finding projects with references to other projects 4) changing the "chosen" references to DLL references (with correct hint paths) then 5) unloading the unwanted projects. "Chosen" is either via context menu (i.e. the selected project(s)) or via a checkbox tree to select items. – iCollect.it Ltd Feb 13 '12 at 09:21
  • Thanks. That should be enough to get me started. – Daniel Dyson Feb 13 '12 at 12:51
  • @HiTechMagic it would be nice to publish your addin :) – Michel Feb 22 '12 at 08:07
  • @Michel: I made an agreement with someone that if this answer hits a decent count I will release a version of the add-in for general use... it has however been sitting here for 7 months and has a count of 5 to-date... Maybe Daniel Dyson (above) will beat me to it :) – iCollect.it Ltd Feb 22 '12 at 11:39
  • @HiTechMagic +1 I'd love to have a copy of that plugin if you're able to share it. – Brandon Moore Jun 15 '12 at 07:40
  • @HiTechMagic Sounds good. You have a website where I can check for its release or am I just gonna have to write comments here every month asking? :) – Brandon Moore Jun 16 '12 at 22:26
  • @Brandon Moore: Website link should be on my user details. Thanks – iCollect.it Ltd Jun 17 '12 at 10:22
  • @HiTechMagic - I too would love to see this addin. (Even if it is not polished, it would be a very nice starting point.) – Vaccano Feb 04 '13 at 18:35
  • Work is now in progress to update and release this tool. Another SO user has recently joined the project to get it moving (for VS 2010 as well as 2012, but no plans for a 2008 version). – iCollect.it Ltd Mar 22 '13 at 08:48
  • @georgiosd: my fellow SO user stopped communicating once he had access to the source code, so the project has not moved on. Go figure. I might see about releasing it as open source instead (I will need to ensure no 3rd party code is left in it). – iCollect.it Ltd Aug 11 '13 at 00:18
  • 1
    @HiTechMagic Ah, sorry to hear that. But yes, releasing it as open source means we can all help. Please post the github link here if you do release it. – georgiosd Aug 11 '13 at 08:40
  • Did this plugin ever come out? – Rohit Vipin Mathews May 11 '15 at 06:34
  • @Rohit: It's nearly 4 years and only 44 up-votes, so it has not been a priority. The 3 colleagues we hired, that could have assisted, all had melt-downs and left :( I dislike the idea of dumping something on GitHub that is in a poor state, but that may be the only option now. – iCollect.it Ltd May 11 '15 at 07:03
  • That's sad, I thought it was a great idea. – Rohit Vipin Mathews May 11 '15 at 07:11
24

I had a similar issue on a solution with 21 projects and 1/2 million LOC. The biggest difference was getting faster hard drives. From the performance monitor, the 'Avg. Disk Queue' would jump up significantly on the laptop, indicating the hard drive was the bottleneck.

Here's some data for total rebuild times...

1) Laptop, Core 2 Duo 2 GHz, 5400 RPM drive (not sure of cache; was a standard Dell Inspiron).

Rebuild Time = 112 seconds.

2) Desktop (standard issue), Core 2 Duo 2.3 GHz, single 7200 RPM drive, 8 MB cache.

Rebuild Time = 72 seconds.

3) Desktop, Core 2 Duo 3 GHz, single 10,000 RPM WD Raptor

Rebuild Time = 39 seconds.

The benefit of the 10,000 RPM drive cannot be overstated. Builds were significantly quicker, plus everything else, like displaying documentation or using the file explorer, was noticeably quicker. It was a big productivity boost from speeding up the code-build-run cycle.

Given what companies spend on developer salaries, it is insane how much they can waste by equipping them with the same PCs as the receptionist uses.

  • 4
    How would an SSD compare to the Raptor? Even faster I guess – RvdK May 27 '10 at 08:48
  • 4
    Yup. My Laptop with an Intel X25M is faster in all aspects than my desktop with a WD Raptor. – CAD bloke Sep 04 '10 at 04:56
  • 4
    It might sound surprising, but it currently isn't worth investing into a 10000 RPM drive. The reason is that the better 7200 RPM drives are faster at the outer rim. So, what one must do is create a small partition. The first partition is at the outer rim, this partition will be faster than a 7200 RPM drive, plus you still have space for a second large partition to store things on. – darklon Sep 16 '10 at 12:22
  • Alternatively, you can also host the solution in a RAM disk and halve those rebuild times. Just commit often ;) – keyle Jun 06 '11 at 09:25
  • 2
    @cornelius: can I get a link that elaborates your point? The only way the outer rim of a 7200 could be faster than the outer rim of a 10000 would be if the 7200s tended to have a larger radius, which maybe could be, but really this trick would be sort of a hack and wouldn't provide benefit for the rest of the hard drive storage on the 7200 that is below the equilibrium radius at which the two drives have equal tangential velocity. – eremzeit Aug 09 '11 at 18:55
  • 2
    I'm with CADbloke. We used raptors until last year when the price point lowered on SSDs to the point that we only use SSDs for the primary drives in our laptops/desktops. The speed increase is fantastic and is easily the single biggest factor in how long it takes to compile our solutions. – NotMe Dec 18 '13 at 14:54
16

For C# .NET builds, you can use .NET Demon. It's a product that takes over the Visual Studio build process to make it faster.

It does this by analyzing the changes you made, and builds only the project you actually changed, as well as other projects that actually relied on the changes you made. That means if you only change internal code, only one project needs to build.

Alex Davies
  • 171
  • 2
  • 5
14

Turn off your antivirus. It adds ages to the compile time.

jdelator
  • 4,101
  • 6
  • 39
  • 53
  • 2
    ... for the code/compile folder. Turning off AV protection as a blanket-coverage rule isn't a brilliant idea. :o) – Brett Rigby Feb 05 '10 at 09:10
  • 5
    You don't really need to turn it off, configuring it properly is usually enough. Add exceptions to the file types the compiler/linker works with. Some antivirus packages have these exceptions added by default, some don't. – darklon Sep 16 '10 at 12:32
  • @cornelius What is the proper anti-virus configuration? Can you provide details? (maybe in a separate question?) – Pavel Radzivilovsky Jan 30 '11 at 16:22
  • @Pavel: Well, exclude file types that the compiler works with, for C++ that would be things like .o, .pdb, .ilk, .lib, .cpp, .h. Also, some antivirus software (eg. Avira AntiVir) allows you to set to scan files on read, write or both. Setting it to scan on read will give you 99% protection. – darklon Feb 10 '11 at 22:25
12

Use distributed compilation. Xoreax IncrediBuild can cut compilation time down to few minutes.

I've used it on a huge C/C++ solution which usually takes 5-6 hours to compile. IncrediBuild helped to reduce this time to 15 minutes.

aku
  • 122,288
  • 32
  • 173
  • 203
  • Installing IncrediBuild on several spare PCs reduced compile time by factor 10 or more for our C++ project with almost no administration effort. – Stiefel Mar 08 '11 at 12:19
  • had the same experience aku... however link was still an issue hehe – Paul Carroll May 31 '11 at 01:37
  • If you are going this route, then simply having a couple dedicated build servers would work. However it looks like the OP was trying to fix build times on the local dev machines. – NotMe Dec 18 '13 at 14:55
11

Instructions for reducing your Visual Studio compile time to a few seconds

Visual Studio is unfortunately not smart enough to distinguish an assembly's interface changes from inconsequential code-body changes. This fact, when combined with a large intertwined solution, can sometimes create a perfect storm of unwanted 'full builds' nearly every time you change a single line of code.

A strategy to overcome this is to disable the automatic reference-tree builds. To do this, use the 'Configuration Manager' (Build / Configuration Manager...then in the Active solution configuration dropdown, choose 'New') to create a new build configuration called 'ManualCompile' that copies from the Debug configuration, but do not check the 'Create new project configurations' checkbox. In this new build configuration, uncheck every project so that none of them will build automatically. Save this configuration by hitting 'Close'. This new build configuration is added to your solution file.

You can switch from one build configuration to another via the build configuration dropdown at the top of your IDE screen (the one that usually shows either 'Debug' or 'Release'). Effectively this new ManualCompile build configuration will render useless the Build menu options for: 'Build Solution' or 'Rebuild Solution'. Thus, when you are in the ManualCompile mode, you must manually build each project that you are modifying, which can be done by right-clicking on each affected project in the Solution Explorer, and then selecting 'Build' or 'Rebuild'. You should see that your overall compile times will now be mere seconds.

For this strategy to work, it is necessary for the VersionNumber found in the AssemblyInfo and GlobalAssemblyInfo files to remain static on the developer's machine (not during release builds of course), and that you don't sign your DLLs.

A potential risk of using this ManualCompile strategy is that the developer might forget to compile required projects, and when they start the debugger, they get unexpected results (unable to attach debugger, files not found, etc.). To avoid this, it is probably best to use the 'Debug' build configuration to compile a larger coding effort, and only use the ManualCompile build configuration during unit testing or for making quick changes that are of limited scope.

Sean
  • 1
  • 1
  • 2
8

If this is C or C++, and you're not using precompiled headers, you should be.
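As a rough sketch of what that looks like in an MSBuild-based C++ project (.vcxproj) - an assumption on my part, since VS2005/2008 .vcproj projects expose the same settings through the property pages instead - the header and source file names here are just the conventional ones, not anything from the question:

<ItemDefinitionGroup>
  <ClCompile>
    <!-- Use the precompiled header for every .cpp in the project... -->
    <PrecompiledHeader>Use</PrecompiledHeader>
    <PrecompiledHeaderFile>stdafx.h</PrecompiledHeaderFile>
  </ClCompile>
</ItemDefinitionGroup>
<ItemGroup>
  <!-- ...except the one file that creates it -->
  <ClCompile Include="stdafx.cpp">
    <PrecompiledHeader>Create</PrecompiledHeader>
  </ClCompile>
</ItemGroup>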

Kristopher Johnson
  • 81,409
  • 55
  • 245
  • 302
  • If pre-compiled headers work for you, then they're a good trick, but they only work if you can establish a strong common subset of headers that rarely change. If you keep having to precompile the headers most of the time that you build, then you aren't saving anything. – Tom Swirly Apr 29 '11 at 16:54
7

We had 80+ projects in our main solution, which took around 4 to 6 minutes to build depending on what kind of machine a developer was working on. We considered that to be way too long: for every single test it really eats away at your FTEs.

So how do you get faster build times? As you seem to already know, it is the number of projects that really hurts the build time. Of course we did not want to get rid of all our projects and simply throw all source files into one. But we had some projects that we could combine nevertheless: as every "Repository project" in the solution had its own unit-test project, we simply combined all the unit-test projects into one global unit-test project. That cut the number of projects by about 12 and somehow saved 40% of the time to build the entire solution.

We are thinking about another solution though.

Have you also tried setting up a new (second) solution with a new project? This second solution should simply incorporate all files using solution folders. You might be surprised to see the build time of that new solution-with-just-one-project.

However, working with two different solutions will take some careful consideration. Developers might be inclined to actually -work- in the second solution and completely neglect the first. As the first solution with the 70+ projects will be the solution that takes care of your object hierarchy, this should be the solution where your build server runs all your unit tests. So the server for Continuous Integration must use the first project/solution. You have to maintain your object hierarchy, right?

The second solution with just one project (which will build much faster) will then be the project where testing and debugging is done by all developers. You have to make sure they keep an eye on the build server though! If anything breaks it MUST be fixed.

Hace
  • 1,421
  • 12
  • 17
7

Make sure your references are Project references, and not directly to the DLLs in the library output directories.

Also, have these set to not copy locally except where absolutely necessary (The master EXE project).
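For reference, the "don't copy locally" setting corresponds to the Private metadata on a reference in the .csproj; a minimal sketch (the project path below is made up):

<ItemGroup>
  <ProjectReference Include="..\MyCompany.Core\MyCompany.Core.csproj">
    <Name>MyCompany.Core</Name>
    <!-- Private maps to "Copy Local" in the IDE -->
    <Private>False</Private>
  </ProjectReference>
</ItemGroup>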

GeekyMonkey
  • 12,478
  • 6
  • 33
  • 39
  • Can you explain why this is faster? "Project references" implies project building (which is much *slower* than a direct DLL reference). – iCollect.it Ltd May 11 '15 at 10:01
6

I notice this question is ages old, but the topic is still of interest today. The same problem bit me lately, and the two things that improved build performance the most were (1) using a dedicated (and fast) disk for compiling and (2) using the same output folder for all projects and setting CopyLocal to False on project references.

Some additional resources:

Community
  • 1
  • 1
Thomas K
  • 11
  • 1
  • 2
6

I posted this response originally here: https://stackoverflow.com/questions/8440/visual-studio-optimizations#8473 You can find many other helpful hints on that page.

If you are using Visual Studio 2008, you can compile using the /MP flag to build a single project in parallel. I have read that this is also an undocumented feature in Visual Studio 2005, but I have never tried it myself.

You can build multiple projects in parallel by using the /M flag, but this is usually already set to the number of available cores on the machine, though this only applies to VC++ I believe.
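For VS2010-and-later MSBuild-based C++ projects (an assumption here; in VS2005/2008 the flag goes into the compiler's 'Additional Options' box instead), the per-project equivalent of /MP can be set in the .vcxproj:

<ItemDefinitionGroup>
  <ClCompile>
    <!-- Compile this project's .cpp files in parallel (cl.exe /MP) -->
    <MultiProcessorCompilation>true</MultiProcessorCompilation>
  </ClCompile>
</ItemDefinitionGroup>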

Community
  • 1
  • 1
Ed S.
  • 122,712
  • 22
  • 185
  • 265
  • 1
    Be careful. There is a bug in 2008 that truncates builds. MS say they will not fix until vs2010. This is a terrible issue because it truncates .obj files causing persistent confusing build issues. you have been warned. ;) http://social.msdn.microsoft.com/forums/en-US/vcgeneral/thread/4cb88018-c05e-450d-893d-e2d9e4f22ec8 – Justin Aug 22 '12 at 02:00
5

Some analysis tools:

Tools > Options > VC++ Project Settings > Build Timing = Yes will tell you the build time for every vcproj.

Add the /Bt switch to the compiler command line to see how long every CPP file took to compile.

Use /showIncludes to catch nested includes (header files that include other header files), and see what files could save a lot of IO by using forward declarations.

This will help you optimize compiler performance by eliminating dependencies and performance hogs.
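A sketch of where those switches can live in an MSBuild-based .vcxproj (an assumption; in VS2005/2008 they go into the compiler's 'Additional Options' box on the property pages):

<ItemDefinitionGroup>
  <ClCompile>
    <!-- Print every header pulled in by each translation unit -->
    <ShowIncludes>true</ShowIncludes>
    <!-- /Bt reports per-file compiler front-end/back-end times -->
    <AdditionalOptions>/Bt %(AdditionalOptions)</AdditionalOptions>
  </ClCompile>
</ItemDefinitionGroup>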

Pavel Radzivilovsky
  • 18,794
  • 5
  • 57
  • 67
5

Before spending money on faster hard drives, try building your project entirely on a RAM disk (assuming you have the RAM to spare). You can find various free RAM disk drivers on the net. You won't find any physical drive, including SSDs, that is faster than a RAM disk.

In my case, a project that took 5 minutes to build on a 6-core i7 on a 7200 RPM SATA drive with Incredibuild was reduced by only about 15 seconds by using a RAM disk. Considering the need to recopy to permanent storage and the potential for lost work, 15 seconds is not enough incentive to use a RAM disk and probably not much incentive to spend several hundreds of dollars on a high-RPM or SSD drive.

The small gain may indicate that the build was CPU bound or that Windows file caching was rather effective, but since both tests were done from a state where the files weren't cached, I lean heavily towards CPU-bound compiles.

Depending on the actual code you're compiling your mileage may vary -- so don't hesitate to test.

Jonathan
  • 41
  • 1
  • 2
  • RAM disks do not help VS build times in my experience, and they are a pain because you have to recreate them every time your computer reboots or crashes. See blog post here: http://josephfluckiger.blogspot.com/2009/02/speed-up-visual-studio-with-ram-disk.html – BrokeMyLegBiking Sep 15 '11 at 11:56
3

How big is your build directory after doing a complete build? If you stick with the default setup, then every assembly that you build will copy all of the DLLs of its dependencies, and its dependencies' dependencies, etc., to its bin directory. At my previous job, when working with a solution of ~40 projects, my colleagues discovered that by far the most expensive part of the build process was copying these assemblies over and over, and that one build could generate gigabytes of copies of the same DLLs.

Here's some useful advice from Patrick Smacchia, author of NDepend, about what he believes should and shouldn't be separate assemblies:

http://codebetter.com/patricksmacchia/2008/12/08/advices-on-partitioning-code-through-net-assemblies/

There are basically two ways you can work around this, and both have drawbacks. One is to reduce the number of assemblies, which is obviously a lot of work. Another is to restructure your build directories so that all your bin folders are consolidated and projects do not copy their dependencies' DLLs - they don't need to because they are all in the same directory already. This dramatically reduces the number of files created and copied during a build, but it can be difficult to set up and can leave you with some difficulty pulling out only the DLLs required by a specific executable for packaging.
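A minimal sketch of the consolidated-output approach in a .csproj, assuming a shared build folder at the solution root (the paths are made up) combined with copy-local turned off on references:

<PropertyGroup>
  <!-- Every project writes to one shared folder instead of its own bin\ -->
  <OutputPath>..\..\build\$(Configuration)\</OutputPath>
</PropertyGroup>
<ItemGroup>
  <ProjectReference Include="..\MyCompany.Core\MyCompany.Core.csproj">
    <!-- No need to copy the dependency; it is already in the shared folder -->
    <Private>False</Private>
  </ProjectReference>
</ItemGroup>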

Weeble
  • 17,058
  • 3
  • 60
  • 75
  • I have left the job that this was an issue for a while back, but in my new position, this is exactly what we have implemented! Cheers. JC – johnc Jan 10 '11 at 20:51
2

Disable file system indexing on your source directories (specifically the obj directories if you want your source searchable)

GeekyMonkey
  • 12,478
  • 6
  • 33
  • 39
2

Perhaps take some common functions and make some libraries, that way the same sources are not being compiled over and over again for multiple projects.

If you are worried about different versions of DLLs getting mixed up, use static libraries.

Adam Pierce
  • 33,531
  • 22
  • 69
  • 89
  • The DLL (and its various cousins likes shared libraries) is almost always a bad idea for an application developer today. Executables are small, even if you link in every last library you use and the amount of memory you save by sharing the code is also small. DLLs hark back to the days when program code footprint in memory and on disk was of key importance, but the size of data, of memory, and of disk have grown much faster than the size of programs. – Tom Swirly Apr 29 '11 at 16:58
2

Turn off VSS integration. You may not have a choice in using it, but DLLs get "accidentally" renamed all the time...

And definitely check your pre-compiled header settings. Bruce Dawson's guide is a bit old, but still very good - check it out: http://www.cygnus-software.com/papers/precompiledheaders.html

Shog9
  • 156,901
  • 35
  • 231
  • 235
  • Certainly we can turn off integration to VSS and drive it through Source Safe UI instead. Nice thought – johnc Sep 11 '08 at 02:15
2

I have a project which has 120 or more exes, libs and dlls and takes a considerable time to build. I use a tree of batch files that call make files from one master batch file. I have had problems with odd things from incremental (or was it temperamental) headers in the past so I avoid them now. I do a full build infrequently, and usually leave it to the end of the day while I go for a walk for an hour (so I can only guess it takes about half an hour). So I understand why that is unworkable for working and testing.

For working and testing I have another set of batch files for each app (or module or library) which also have all the debugging settings in place -- but these still call the same make files. I may switch DEBUG on or off from time to time, and also decide on builds or makes, or whether I also want to build libs that the module may depend on, and so on.

The batch file also copies the completed result into one (or several) test folders. Depending on the settings, this completes in several seconds to a minute (as opposed to, say, half an hour).

I used a different IDE (Zeus) as I like to have control over things like .rc files, and actually prefer to compile from the command line, even though I am using the MS compilers.

Happy to post an example of this batch file if anyone is interested.

David L Morris
  • 1,461
  • 1
  • 12
  • 19
1

If it's a C++ project, then you should be using precompiled headers. This makes a massive difference in compile times. I'm not sure what cl.exe is really doing (when not using precompiled headers); it seems to be looking for lots of STL headers in all of the wrong places before finally going to the correct location. This adds entire seconds to every single .cpp file being compiled. Not sure if this is a cl.exe bug, or some sort of STL problem in VS2008.

Chris O
  • 5,017
  • 3
  • 35
  • 42
1

Does your company happen to use Entrust for their PKI/Encryption solution by any chance? It turns out, we were having abysmal build performance for a fairly large website built in C#, taking 7+ minutes on a Rebuild-All.

My machine is an i7-3770 with 16 GB RAM and a 512 GB SSD, so performance should not have been that bad. I noticed my build times were insanely faster on an older secondary machine building the same codebase. So I fired up ProcMon on both machines, profiled the builds, and compared the results.

Lo and behold, the slow-performing machine had one difference -- a reference to Entrust.dll in the stack trace. Using this newly acquired info, I continued to search Stack Overflow and found this: MSBUILD (VS2010) very slow on some machines. According to the accepted answer, the problem lies in the fact that the Entrust handler was processing the .NET certificate checks instead of the native Microsoft handler. It is also suggested that Entrust v10 solves this issue, which is prevalent in Entrust 9.

I currently have it uninstalled and my build times plummeted to 24 seconds. YMMV with the number of projects you are currently building, and this may not directly address the scaling issue you were inquiring about. I will post an edit to this response if I can provide a fix without resorting to uninstalling the software.

Community
  • 1
  • 1
Eric Aho
  • 1
  • 2
1

Looking at the machine that you're building on, is it optimally configured?

We just got our build time for our largest C++ enterprise-scale product down from 19 hours to 16 minutes by ensuring the right SATA filter driver was installed.

Subtle.

JBRWilkinson
  • 4,821
  • 1
  • 24
  • 36
1

There's an undocumented /MP switch in Visual Studio 2005 (see http://lahsiv.net/blog/?p=40) which enables parallel compilation on a per-file basis rather than a per-project basis. This may speed up compiling the last project, or builds where you only compile one project.

Pavel Radzivilovsky
  • 18,794
  • 5
  • 57
  • 67
1

When choosing a CPU: L1 cache size seems to have a huge impact on compilation time. Also, it is usually better to have 2 fast cores than 4 slow ones. Visual Studio doesn't use the extra cores very effectively. (I base this on my experience with the C++ compiler, but it is probably also true for the C# one.)

darklon
  • 468
  • 3
  • 13
1

I'm also now convinced there is a problem with VS2008. I'm running it on a dual-core Intel laptop with 3 GB RAM, with anti-virus switched off. Compiling the solution is often quite slick, but if I have been debugging, a subsequent recompile will often slow down to a crawl. It is clear from the continuously lit main disk light that there is a disk I/O bottleneck (you can hear it, too). If I cancel the build and shut down VS, the disk activity stops. Restart VS, reload the solution and then rebuild, and it is much faster. Until the next time.

My thoughts are that this is a memory paging issue - VS just runs out of memory and the O/S starts page swapping to try to make space but VS is demanding more than page swapping can deliver, so it slows down to a crawl. I can't think of any other explanation.

VS definitely is not a RAD tool, is it?

haughtonomous
  • 4,602
  • 11
  • 34
  • 52
1

If this is a web app, setting batch build to true can help depending on the scenario.

<compilation defaultLanguage="c#" debug="true" batch="true" />

You can find an overview here: http://weblogs.asp.net/bradleyb/archive/2005/12/06/432441.aspx

Daniel Auger
  • 12,535
  • 5
  • 52
  • 73
1

One cheaper alternative to Xoreax IB is the use of what I call uber-file builds. It's basically a .cpp file that has

#include "file1.cpp"
#include "file2.cpp"
....
#include "fileN.cpp"

Then you compile the uber units instead of the individual modules. We've seen compile times go from 10-15 minutes down to 1-2 minutes. You might have to experiment with how many #includes per uber file make sense. It depends on the project, etc. Maybe you include 10 files, maybe 20.

You pay a cost so beware:

  1. You can't right-click a file and say "compile..." as you have to exclude the individual cpp files from the build and include only the uber cpp files (see the sketch at the end of this answer)
  2. You have to be careful of static global variable conflicts.
  3. When you add new modules, you have to keep the uber files up to date

It's kind of a pain, but for a project that is largely static in terms of new modules, the initial pain might be worth it. I've seen this method beat IB in some cases.
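To illustrate cost #1, in an MSBuild-based .vcxproj the individual translation units can stay in the project for editing but be excluded from the build, so that only the uber file compiles (a sketch; the file names are made up, and older .vcproj projects use the 'Excluded From Build' property page instead):

<ItemGroup>
  <!-- The uber file is the only one the compiler actually builds -->
  <ClCompile Include="uber1.cpp" />
  <!-- The real modules remain in the project but are not compiled directly -->
  <ClCompile Include="file1.cpp">
    <ExcludedFromBuild>true</ExcludedFromBuild>
  </ClCompile>
  <ClCompile Include="file2.cpp">
    <ExcludedFromBuild>true</ExcludedFromBuild>
  </ClCompile>
</ItemGroup>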

Mark
  • 10,022
  • 2
  • 38
  • 41
1

You also may want to check for circular project references. It was an issue for me once.

That is:

Project A references Project B

Project B references Project C

Project C references Project A

Bramha Ghosh
  • 6,504
  • 4
  • 30
  • 29
0

There are a few things that I have found useful for speeding up C#/.NET builds:

  • Combine small projects into larger projects as there is a large per project overhead on building a solution. (Use nDepend if needed to control calling across layers)

  • Make all your projects build into the same output directory and then set “copy local” to false on all the project references – this can lead to a large speed-up due to reduced IO.

  • Turn off your virus checker to see if it makes much difference; if so, find a faster virus checker, or exclude the "hot" folders from virus checker scanning

  • Use Process Monitor and the other Sysinternals tools to see why your compiles are taking so long.

  • Consider a ram disk to put your output directory on.

  • Consider using a SSD

  • More memory can have a big effect at times – if you see a big slowdown after removing a little RAM, you may get a big speed-up by adding more

  • Remove unneeded project references (you may have to remove unneeded “usings” first)

  • Consider using a dependency injection framework and interfaces for your lowest domain layer, so a recompile of everything is only needed when the interface changes – this may not gain much depending on how often the interface is changed.

Community
  • 1
  • 1
Ian Ringrose
  • 51,220
  • 55
  • 213
  • 317
0

Nice suggestions that have helped so far (not saying there aren't other nice suggestions below - if you are having issues, I recommend reading them - just what has helped us)

  • New 3GHz laptop - the power of lost utilization works wonders when whinging to management
  • Disable Anti Virus during compile
  • 'Disconnecting' from VSS (actually the network) during compile - I may get us to remove VS-VSS integration altogether and stick to using the VSS UI

Still not rip-snorting through a compile, but every bit helps.

We are also testing the practice of building new areas of the application in new solutions, importing in the latest DLLs as required, then integrating them into the larger solution when we are happy with them.

We may also do the same to existing code by creating temporary solutions that just encapsulate the areas we need to work on, and throwing them away after reintegrating the code. We need to weigh up the time it will take to reintegrate this code against the time we gain by not having Rip Van Winkle-like experiences with rapid recompiling during development.

Orion did mention in a comment that generics may have a part to play as well. From my tests there does appear to be a minimal performance hit, but not high enough to be sure - compile times can be inconsistent due to disc activity. Due to time limitations, my tests didn't include as many generics, or as much code, as would appear in the live system, so the effect may accumulate. I wouldn't avoid using generics where they are supposed to be used just for compile-time performance.

johnc
  • 39,385
  • 37
  • 101
  • 139
0

I have found the following helps the compile speed: After repeated compiles (iterative development in memory), I would quit VS2008. Then go into the project directories and delete all the obj and bin folders, start the project back up, and my compile time went way back down. This is similar to the Clean solution action within VS2008.

Just want to confirm if this is useful for anyone out there.

SchmitzIT
  • 9,227
  • 9
  • 65
  • 92
Luke Li
  • 1
  • 1
0

There is definitely a problem with VS2008. The only thing I did was install VS2008 to upgrade my project, which had been created with VS2005. I've only got 2 projects in my solution. It isn't big. Compilation with VS2005: 30 seconds. Compilation with VS2008: 5 minutes.

0

I found my fix here: http://blogs.msdn.com/b/vsdteam/archive/2006/09/15/756400.aspx

Adi
  • 5,113
  • 6
  • 46
  • 59
-1

Slow Visual Studio Performance … Solved! September 24th, 2014 by Uzma Abidi

I had an odd performance-related issue today. My Microsoft Visual Studio seemed to be taking far too long to perform even the simplest of operations. I Googled around and tried a few ideas that people had such as disabling add-ins or clearing Visual Studio’s recent projects list but those suggestions didn’t seem to solve the problem. I remembered that the Windows SysInternals website had a tool called Process Monitor that would sniff registry and file accesses by any running program. It seemed to me that Visual Studio was up to something and Process Monitor should help me figure out what it was. I downloaded the most recent version, and after fiddling around a bit with its display filters, ran it and to my horror, I saw that Visual Studio was so slow because it was accessing the more than 10,000 folders in C:\Users\krintoul\AppData\Local\Microsoft\WebSiteCache on most IDE operations. I’m not sure why there were that many folders and moreover, wasn’t sure what Visual Studio was doing with them, but after I zipped those folders up and moved them somewhere else, Visual Studio’s performance improved tremendously.

The Windows SysInternals website has a number of other useful utilities for network management, security, system information and more. Check it out. I’m sure you’ll find something of value.

Andrey Korneyev
  • 26,353
  • 15
  • 70
  • 71