As a code base grows, so does its build time. Long build times are a bad thing. Well, they are a bad thing for developer productivity, but in my case a good thing for my Stack Overflow rep.
A little under ten years ago I took a contract working on a desktop app for an equity portfolio prop desk. The requirement wasn’t particularly complex: pull in data from a few different sources, run some basic analytics, render some pretty grids/charts, etc. Simple. The implementation was crazy: 50+ projects in a single solution (VS2002) and a mixture of C# and VB.NET (the London dev team were C# hacks, NYC were VB guys… the two teams didn’t get on). VS2002 did a poor job of working out the build order (* VS2010 is still not that clever, as I found out this week) and so built some projects many times over, resulting in a build time of circa 15 minutes on what was, at the time, a high-end PC workstation. The solution was to build with a NAnt script and so explicitly stipulate the build order. That got the build time down to 3 minutes; still less than ideal, but not a total disaster.
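To give a flavour of that fix, here is a minimal sketch of that kind of NAnt script (solution and target names are made up). Each target declares its dependencies explicitly, so every solution is built exactly once, in the right order:

```xml
<?xml version="1.0"?>
<!-- Hypothetical sketch: solution/target names are illustrative only. -->
<project name="prop-desk-app" default="build">
  <!-- Core has no dependencies, so it builds first. -->
  <target name="core">
    <solution solutionfile="Core.sln" configuration="release" />
  </target>
  <!-- depends="..." spells out what VS2002 couldn't work out for itself. -->
  <target name="analytics" depends="core">
    <solution solutionfile="Analytics.sln" configuration="release" />
  </target>
  <target name="ui" depends="analytics">
    <solution solutionfile="UI.sln" configuration="release" />
  </target>
  <target name="build" depends="ui" />
</target>
</project>
```

Because the dependency chain is stated once, up front, nothing gets built twice no matter how confused the IDE is.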
Fast forward to 2011 and I’m working with another fairly simple (in terms of architecture, at least) monolithic web application. However, there is lots of it: literally thousands of ASPX pages, even more controls, and a SQL schema with nearly 600 tables, and therefore nearly 600 entity classes, not to mention a hefty service layer. In total there are just over 180 VS projects. The developer machines are mid-range desktop models, around three years old. VS solutions are aggregated along functional lines, so the build process is as follows:
- Get the latest version and build the entire code base using a bespoke, application-specific in-house build tool. 10 minutes.
- Build it all again, because the aforementioned build tool doesn’t work out the build order correctly and needs a second pass. 10 minutes.
- Open the solution for the ‘module’ being developed. Keep the build tool running to copy newly built assemblies to the application for debugging.
A long ramp-up, but once you have the fire stoked you can get things done. But now you want to refactor part of the core application framework. Oh dear. Neither ReSharper nor any other tool is going to help you here, so that’s 60 or so solutions you need to compile-fix-repeat once you have refactored the framework. Disaster. Why not have a solution that includes all projects just for that purpose, but use the sub-set solutions for development along functional rather than platform lines? That’s a non-starter for two reasons: 1) ever tried to open a solution with 180 projects on a low-spec machine? Ouch. 2) Even if you have the patience of a saint and the wind is blowing in the right direction, the projects reference each other via binary refs, not project refs. Big Fat FAIL. So here’s what we did:
- Upgrade the hardware: 8-core 64-bit i7s with RAID and 8GB RAM. Nice. (I intend to install some SSDs in mine and increase the RAM to 16GB, but then I’m the sort of chap that can happily read Custom PC for hours.)
- Create a single solution to which all projects belong, and change the inter-project references to project refs instead of binary refs. This is controversial. It fixes one problem, in that core refactoring is now feasible without getting RSI, but the build time is still 5 minutes even with the new workstations. Don’t panic, there’s more…
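For anyone who hasn’t made the switch, the difference inside the .csproj is roughly this (assembly and path names are hypothetical). A binary reference gives MSBuild no dependency information at all, whereas a project reference lets it derive the build order itself:

```xml
<!-- Before: a binary reference. MSBuild sees only a DLL on disk,
     so it has no idea that Acme.Core must be built first. -->
<Reference Include="Acme.Core">
  <HintPath>..\lib\Acme.Core.dll</HintPath>
</Reference>

<!-- After: a project reference. The build order is derived
     automatically, and refactoring tools can see across the
     project boundary. -->
<ProjectReference Include="..\Acme.Core\Acme.Core.csproj">
  <Name>Acme.Core</Name>
</ProjectReference>
```

The same change is also what makes solution-wide refactoring tools useful: they can follow a project reference, but a binary reference is a dead end.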
- Build a generic (as in not application-specific) parallel build tool that reduces the build time to 3 minutes. **
- Further enable build concurrency by introducing an IoC container to decouple client code from implementations.
- Observe complete builds of under one minute, and builds of only the changed code and its dependents in mere seconds.
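To illustrate the decoupling point: with an IoC container, client projects reference only a contracts assembly, so rebuilding an implementation assembly no longer cascades through everything that uses it, and independent implementation projects can compile in parallel. A minimal sketch (all type names are hypothetical, and a toy container stands in for whichever real one you use):

```csharp
using System;
using System.Collections.Generic;

// Contracts assembly: the only reference client projects need.
public interface IPriceFeed
{
    decimal GetPrice(string symbol);
}

// Implementation assembly: can change (and rebuild) without
// triggering a rebuild of any client project.
public class StubPriceFeed : IPriceFeed
{
    public decimal GetPrice(string symbol) { return 100m; }
}

// A toy container; any real IoC container does the same job.
public class Container
{
    private readonly Dictionary<Type, Func<object>> _registrations =
        new Dictionary<Type, Func<object>>();

    public void Register<TContract, TImpl>()
        where TImpl : TContract, new()
    {
        _registrations[typeof(TContract)] = () => new TImpl();
    }

    public TContract Resolve<TContract>()
    {
        return (TContract)_registrations[typeof(TContract)]();
    }
}

// Composition root: the one place that knows both sides.
public static class Program
{
    public static void Main()
    {
        var container = new Container();
        container.Register<IPriceFeed, StubPriceFeed>();

        // Client code compiles against the interface alone.
        IPriceFeed feed = container.Resolve<IPriceFeed>();
        Console.WriteLine(feed.GetPrice("VOD.L"));
    }
}
```

The compile-time dependency graph becomes wide and shallow instead of a long chain, which is exactly what a parallel build tool needs to keep all the cores busy.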
So, the hardware upgrade was a long-overdue prerequisite. The single solution was one step forward, one step back. The parallel build tool was fun to build and I’m looking forward to releasing it as open source, but the real killer move was the decoupling of the code. There is another step that could be taken to decrease build time further: amalgamate the 60 or so web projects that contain the ASPX/ASCX/JS files etc. Currently they are copied to a common target post-build, but now that there is a single solution and they are only ever used in one application, this is unnecessary. It would also have the happy side effect of a faster VS solution load time, which is rather lacklustre with ReSharper enabled.
Question: Is rapid build time a valid design goal?
Answer: Yes, absolutely. TDD has a profound impact on code architecture/design. Build time is another important factor in terms of development productivity and must be a consideration from the outset. It is perfectly valid and desirable to make code architecture/design decisions purely to enable fast builds.
If you are making tea for 100 people, you get an urn; you don’t boil a kettle. So much for my measly Stack Overflow reputation.
* I upgraded a 10-project mixed C#/VB.NET solution from VS2002 through to VS2010 and it still didn’t get the build order right; I had to help it out by manually specifying the project dependencies.
** MSBuild already supports parallel builds, but not like this build tool, soon to be released under an MIT open-source licence on BitBucket… stay tuned.
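For comparison, MSBuild’s built-in parallelism is driven by the `/m` (maxcpucount) switch, which parallelises across projects only as far as the dependency graph allows (solution name illustrative):

```
# Build with up to 8 concurrent project builds (VS2010 / .NET 4 MSBuild).
msbuild Monolith.sln /m:8 /p:Configuration=Release
```

Which is another reason the decoupling step matters: a long chain of dependent projects gives `/m` nothing to run in parallel.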