37

My company has been running a large project on Delphi for more than a decade. Our codebase has grown over the years and now stands at around 4 million lines of code. Compilation speed is becoming an issue. We've spent time weeding out circular unit references (a known cause of slow compilation) and examined every aspect of the setup. It has reached the point where we can't materially improve it any further with what we can control.
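
(For readers unfamiliar with the problem, here is a rough sketch of the kind of cycle we weeded out; the unit names are invented for the example. Interface-level cycles are rejected outright by the compiler, but implementation-level cycles are legal, and large webs of them make the compiler's dependency tracking much more expensive.)

```pascal
// Hypothetical example: two units that reference each other through
// their implementation sections. Each block below would live in its
// own .pas file.

unit Orders;                     // Orders.pas
interface
type
  TOrder = class
    procedure Print;
  end;
implementation
uses
  Customers;                     // Orders -> Customers
procedure TOrder.Print;
begin
  { ...works with TCustomer... }
end;
end.

unit Customers;                  // Customers.pas
interface
type
  TCustomer = class
    procedure AddOrder;
  end;
implementation
uses
  Orders;                        // Customers -> Orders: the cycle
procedure TCustomer.AddOrder;
begin
  { ...works with TOrder... }
end;
end.
```

Breaking such a cycle typically means moving the declarations both sides need into a new lower-level unit (say, OrderTypes.pas) that Orders and Customers both use, so that neither needs the other any more.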

At the moment, on a state-of-the-art PC with 4 cores running Windows XP SP3 and Delphi 2006, starting Delphi fresh and doing a full build takes ~40 seconds. If we then do another full build immediately in the same Delphi session, it takes 1m 40s. Do another full build and it gets worse still, and so on.

(We are well aware that Windows itself caches files and that this has a big impact on compilation speed. The figures above assume the files are already cached; we set up that scenario by getting Delphi to compile the project once, terminating it, and then starting a new Delphi session. So while 40 seconds may not sound slow, that is only because the files are cached by Windows. We do this to get an apples-to-apples comparison.)

What puzzles us is why the compilation speed gets worse. (We have observed in the past that the slowdown was worse when the project had a lot of circular unit references.) If we terminate Delphi and start a new session, the compilation time goes back to 40 seconds. Even more interestingly, we can achieve the same speed "improvement" by clicking the "Cancel" button to abort a compilation and then doing the full build right away; the compilation time goes back to 40 seconds too.

It appears to us that Delphi's own cache of unit dependencies is less efficient than building it from scratch, and that it degrades over time. It also appears that the Cancel button somehow clears this cache. Our thinking is that if we could tap into the Delphi IDE subsystem that does this clearing, we could keep compilation speed at its peak. But we don't know how.

Does anyone know what we can do?

We are still using Delphi 2006 because we haven't yet found a feasible way to port our large project to Unicode. I have read on forums that the latest Delphi XE exhibits a similar compilation slowdown with circular unit references. Does anyone know whether Delphi XE has addressed the problem?

P.S. We are also aware that splitting the project into runtime packages can reduce compilation time, but for deployment and administrative reasons we try to avoid using runtime packages.

asked by John; edited by Rob Kennedy
  • Did you rule out anti-virus software, indexing software and 3rd-party Experts running in the IDE? – Cosmin Prund Jul 05 '11 at 04:45
  • Have you considered the command line compiler? – J-16 SDiZ Jul 05 '11 at 05:10
  • http://andy.jgknet.de/blog/ide-tools/delphispeedup/ – RobertFrank Jul 05 '11 at 05:17
  • Regarding the weeding out of unit references, did you do that manually? If so, you should try out CnPack's Uses Cleaner. *(I haven't had problems with it, but a backup of your files before starting wouldn't hurt)* – Lieven Keersmaekers Jul 05 '11 at 06:11
  • @J-16 The command line compiler is slower than the IDE compiler, at least in most Delphi versions, unless you use some speed-up-specific hack like http://andy.jgknet.de/blog/ide-tools/dcc32speed-12 – Arnaud Bouchez Jul 05 '11 at 06:13
  • @Lieven Are you sure that a cleaner "uses" clause will make the compilation faster? If some units are not used at all, it will compile fewer lines of code, so it will be faster. But if the units are merely listed in the "uses" clause, not actually used in this unit but used in another unit, it won't make any difference IMHO. – Arnaud Bouchez Jul 05 '11 at 06:15
  • @A.Bouchez - no, not sure, but it's a case of *it doesn't hurt trying*. – Lieven Keersmaekers Jul 05 '11 at 06:36
  • Note that a "state of the art PC with 4 cores" isn't really helping much here. The Delphi compiler will only use one core, so you are better off with faster cores, i.e. everything else being equal, take dual 3.33 GHz over quad 1.8 GHz. – Chris Thornton Jul 05 '11 at 11:50
  • Have you tried an SSD hard drive? – Harriv Jul 05 '11 at 18:11
  • @Lieven - We removed the circular unit references manually. We tried some tools before but they didn't work; I believe the project was too big for them to handle. But we haven't tried CnPack. I'll take a look. Thanks! – John Jul 06 '11 at 01:34
  • @John, it did make a big difference in the past. Look here for a reference (and shameless plug): http://stackoverflow.com/questions/920560/delphi-how-to-organize-source-code-to-increase-compiler-performance/920572#920572 – Lieven Keersmaekers Jul 06 '11 at 06:32
  • Delphi XE still has this bug. What I do is restart the IDE every 20 minutes (whenever I check my email, take a break, etc.). – Gabriel May 10 '14 at 07:43
  • @ChrisThornton - True, but if you have a multicore system, the other programs will run on the extra cores, leaving (hopefully) a core free for Delphi. – Gabriel May 10 '14 at 10:48

6 Answers

23

If you do a full build of your application, here are some tricks to speed up the process:

  • Erase all *.dcu files before the build (del *.dcu /s);
  • Run a good defragmenter on the corresponding hard drive;
  • Put most of your source files in the same directory, and keep the IDE and Project library paths as short as possible, with the most-used entries first;
  • Install DelphiSpeedUp.

Delphi 2007 should compile faster than Delphi 2006.

Delphi 2009/2010/XE would probably be slower: from user experience, the implementation of generics and the new RTTI made the compilation process more complex, and in practice the compiler was found to be slower than, for example, Delphi 2007's.

Update:

Did you try enabling the ProjectClearUnitCacheItem hidden menu entry?

(Screenshot: the hidden "Clear Unit Cache" menu entry.)

I have this entry enabled by either CnPack or DDevExtensions (I don't know which one does it, probably the latter). It can be used to clear the internal unit cache.
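
(As a follow-up to the "small expert" idea raised in the comments below, here is a rough, untested sketch of what such an add-in might look like, installed from a design-time package. The component name 'ProjectClearUnitCacheItem' is taken from this answer; the 'ToolsMenu' name and the rest of the wiring are assumptions, so treat it as a starting point rather than a drop-in solution.)

```pascal
unit ClearCacheAndBuild;

{ Sketch of an IDE add-in: clicks the hidden "Clear Unit Cache" menu
  entry, then rebuilds the active project. Install from a design-time
  package that requires designide. Untested; menu component names other
  than ProjectClearUnitCacheItem are guesses. }

interface

procedure Register;

implementation

uses
  SysUtils, Classes, Menus, ToolsAPI;

type
  TClearCacheAndBuildHook = class(TComponent)
  public
    procedure HandleClick(Sender: TObject);
  end;

// Depth-first search of the IDE main menu for an item by component name.
function FindIdeMenuItem(const AName: string): TMenuItem;

  function Search(AItem: TMenuItem): TMenuItem;
  var
    I: Integer;
  begin
    Result := nil;
    if SameText(AItem.Name, AName) then
      Result := AItem
    else
      for I := 0 to AItem.Count - 1 do
      begin
        Result := Search(AItem[I]);
        if Result <> nil then
          Exit;
      end;
  end;

begin
  Result := Search((BorlandIDEServices as INTAServices).MainMenu.Items);
end;

// Finds the active project of the currently loaded project group.
function ActiveProject: IOTAProject;
var
  Services: IOTAModuleServices;
  Group: IOTAProjectGroup;
  I: Integer;
begin
  Result := nil;
  Services := BorlandIDEServices as IOTAModuleServices;
  for I := 0 to Services.ModuleCount - 1 do
    if Supports(Services.Modules[I], IOTAProjectGroup, Group) then
    begin
      Result := Group.ActiveProject;
      Exit;
    end;
end;

procedure TClearCacheAndBuildHook.HandleClick(Sender: TObject);
var
  ClearItem: TMenuItem;
  Project: IOTAProject;
begin
  // Same effect as clicking the hidden menu entry by hand...
  ClearItem := FindIdeMenuItem('ProjectClearUnitCacheItem');
  if ClearItem <> nil then
    ClearItem.Click;
  // ...followed immediately by a full build of the active project.
  Project := ActiveProject;
  if Project <> nil then
    Project.ProjectBuilder.BuildProject(cmOTABuild, False);
end;

procedure Register;
var
  ToolsMenu, NewItem: TMenuItem;
  Hook: TClearCacheAndBuildHook;
begin
  ToolsMenu := FindIdeMenuItem('ToolsMenu');  // assumed name of the Tools menu
  if ToolsMenu = nil then
    Exit;
  Hook := TClearCacheAndBuildHook.Create(ToolsMenu);  // freed with its owner
  NewItem := TMenuItem.Create(ToolsMenu);
  NewItem.Caption := 'Clear Unit Cache + Build';
  NewItem.OnClick := Hook.HandleClick;
  ToolsMenu.Add(NewItem);
end;

end.
```

With something like this installed, one click reproduces the "Cancel, then Build" effect described in the question without restarting the IDE (assuming the hidden menu entry is present, e.g. via DDevExtensions).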

answered by Arnaud Bouchez; edited by Martin Schneider
  • @A.Bouchez What is the unit cache, and what benefits does clearing it bring? – David Heffernan Jul 05 '11 at 08:42
  • @David I used this e.g. when I hacked the VCL source code, for http://synopse.info/forum/viewforum.php?id=6 The compiler maintains an internal cache for the dcus. If you modify the VCL source, you have to quit and restart the IDE for the modification to be taken into account on the next build. But this would perhaps solve the OP's problem, since it seems to be a performance issue within the same IDE session. Since it worked for me as a replacement for an IDE quit/restart, it could be the same for John. – Arnaud Bouchez Jul 05 '11 at 09:09
  • For the OP: if clicking "Clear Unit Cache" and then "Build" makes the build fast again, you can write a small expert that installs a "Clear unit cache + Build" command that does both automatically. – Cosmin Prund Jul 05 '11 at 11:40
  • @A.Bouchez YES! The hidden "Clear Unit Cache" menu item does the trick! It brings the compilation time back to the initial 40 secs. Excellent! Thanks. (Btw, Delphi took 8 secs to clear the unit cache. Interestingly, Task Manager didn't show any change in memory usage.) – John Jul 06 '11 at 01:47
  • @John Happy you found your way. It was therefore not a memory or speed issue, but the internal layout of the cache, which doesn't scale as expected when a lot of units are involved. Some kind of O(n²) algorithm (e.g. nested loops) is probably used internally and fails to scale. Perhaps worth creating a QC report if it's still there in Delphi XE. ;) – Arnaud Bouchez Jul 06 '11 at 05:13
18

The gradual performance degradation could be due to some sort of memory leak or other bug in the compiler. Heaven knows D2005 and D2006 had enough of them! If you can't upgrade to a Unicode-enabled version of Delphi, you ought to at least update to D2007 (which I believe is still available from Embarcadero) for better stability.

Also, as Robert Frank mentioned in a comment, check out Andreas Hausladen's tools. Just a few days ago he released a patch that improves compilation speed quite a bit. Unfortunately, that specific feature is apparently only for D2009 and later, but a lot of his fixes help speed various things up, including the compiler.

answered by Mason Wheeler
  • I suspect that this is the case. I'd recommend using Process Explorer from SysInternals to see if the memory usage of bds.exe goes up each time the project is compiled. That would confirm the memory leak theory. – Chris Thornton Jul 05 '11 at 12:55
  • Sounds more like a "memory leak conspiracy theory". If you have plenty of RAM, it will take some time before the IDE starts swapping to disk. – Arnaud Bouchez Jul 05 '11 at 13:52
7

It's well worth trying DelphiSpeedUp from Andreas Hausladen but that will only help IDE performance rather than compilation as I understand it.

The other idea that nobody has suggested yet is to use high spec solid state disks.

I recommend using 64 bit Windows 7 with a large amount of RAM for the best file caching performance.

Just be thankful your project isn't written in C++!

answered by David Heffernan
  • +1 for 64-bit Win7, because a 4-core Windows XP system as the OP mentioned is *not* state of the art, not today! But I'm not so sure about those high-spec solid state disks: the OP says he's forcing Windows to cache all files before doing the actual build. I'm not sure better I/O would speed things up. – Cosmin Prund Jul 05 '11 at 06:43
  • @cosmin The caching comments in the post concern making benchmarks more repeatable. Decent I/O would reduce the effect of caching. 64-bit Win 7 with, say, 16GB RAM would result in a big file cache too. – David Heffernan Jul 05 '11 at 06:53
  • AFAIK DelphiSpeedUp replaces low-level RTL functions with faster implementations, including using GetFileAttributesEx instead of the much slower FindFirstFile/FindClose API. – Arnaud Bouchez Jul 05 '11 at 07:52
  • @A.Bouchez The Delphi compiler is written in C and doesn't use the RTL, as far as I know. – David Heffernan Jul 05 '11 at 07:55
  • +1 for the solid state disk. I'm running a new machine I've built with a 300GB Crucial SSD and a 1TB WD HD under W7 64-bit. All working Delphi projects are located and built on the SSD. 1 million lines of code builds the first time in D7 and XE in 11s, the second time in 8s. The SSD transfer rate is 200-300 MB/s against the 1TB disk's 40 MB/s-ish. – Brian Frost Jul 05 '11 at 11:27
  • @Brian I don't think the transfer rate is the key factor here, rather the latency to start a transfer. – David Heffernan Jul 05 '11 at 11:36
  • @David I was talking about the C RTL, which has also been hacked by Andreas. He did not only patch the IDE part. See e.g. what he did to the poor unwilling command line compiler: http://andy.jgknet.de/blog/ide-tools/dcc32speed-12 – Arnaud Bouchez Jul 05 '11 at 13:49
  • @David Heffernan - While using solid state drives doesn't address the gradual degradation of compilation speed (the main issue we have), we do consider it a parallel way to speed up compilation (when the .pas files are not in the Windows cache). Thanks. – John Jul 06 '11 at 01:57
1

Consider building with runtime packages in-house, then building monolithic executables when sending code to a QA department or distributing your application.

This requires extra maintenance, but the dramatic improvement in build times is worth it IMO.

We have a 2.4 MLOC project with about 40-50 smaller, supporting applications. When compiled against a group of runtime packages, only about 500K lines of the project get compiled, and it builds about 6 times faster (15 seconds vs. 90 seconds). Many of the smaller applications compile in one second or less because so much of the packaged code is shared.

You need to be sure to test the monolithic executable, not the packaged executable. But generally you shouldn't see too many behavior differences if you follow good coding practices.
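
(As a rough illustration of this setup, with the package and unit names invented for the example: the shared code is moved into one or more runtime packages, roughly like this.)

```pascal
package CoreRuntime;              // hypothetical CoreRuntime.dpk

{$RUNONLY}                        // runtime-only package

requires
  rtl,
  vcl;

contains
  SharedTypes in 'SharedTypes.pas',   // code shared by the main project
  SharedUtils in 'SharedUtils.pas';   // and the supporting applications

end.
```

The in-house build configuration then has "Build with runtime packages" enabled and lists CoreRuntime, so only the project-specific units get recompiled; the QA/release configuration has it disabled, producing the monolithic executable that actually gets tested and shipped.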

answered by David Robb
  • We have considered what you suggested, but we are still not sure whether the overhead and risk introduced outweigh the benefit. – John Jul 06 '11 at 01:50
  • That is understandable and I agree. It is particularly difficult to maintain dual compilations. Build Configurations help a lot. There is another potential benefit that I didn't mention before: because packages cannot recursively depend upon each other, packaged builds enforce good software layering. E.g. low level code cannot be statically linked to higher-level code. That alone is reason to at least organize code into runtime packages. – David Robb Jul 06 '11 at 15:30
0

Have you tried compiling the code from a command-line script?

Does recompiling from the command line keep the build at a steady 40 seconds?

Run "dcc32.exe" from cmd to see its usage.

Update: I can't check it right now, but you should try compiling from the command line; if you then run from the IDE, the IDE should not need to recompile, and it still lets you run with the debugger.

answered by none
  • In practice, I have found the command line compiler to be slower than the IDE compiler, at least in most Delphi versions, unless you use some speed-up-specific hack like andy.jgknet.de/blog/ide-tools/dcc32speed-12 – Arnaud Bouchez Jul 05 '11 at 12:24
  • We do use the command line to do the final builds for production deployment. Yes, it gives a pretty consistent 40-second compilation time. But don't you lose debugging in the IDE? (I don't know if we have hacks in place though. I need to check with my colleague to find out.) – John Jul 06 '11 at 01:54
0

This question has some more advice on getting better compilation speed. Avoiding circular references and detecting unused units (with CnWizards) have the greatest effect.

answered by Fabricio Araujo