Monday, 14 July 2014

Implementing DotCover from Powershell

We've just made the jump to JetBrains DotCover, and I'm reasonably impressed by the command-line implementation. It's worth mentioning that the developers are very happy with the Visual Studio integration, but that's not a part of the world I inhabit, so I couldn't comment.

Supporting DotCover in our build pipeline was almost painless. The only place I struggled was with the DotCover command-line documentation: it took me a good while to realise that TargetArguments is where I specify the command-line arguments used by the test runner. In our case, the actual MSTest command-line arguments.

Excuses aside, here's my Powershell invocation of JetBrains DotCover.

At present, we do not use the XML configuration files. When the developers start producing their own XML configurations within each Visual Studio solution, I shall implement that functionality.
Instrumentation & Test

We automatically exclude the test assembly namespace from the coverage analysis.
  • We're not interested in how well our tests cover our tests.
  • i.e. com.product.domain.project.tests 
The scope of the coverage analysis is limited to only the project under examination.
  • Any unintentional coverage of other projects is discarded.
  • i.e. com.product.domain.project 
Individual coverage reports are merged last into a single domain report
  • i.e. com.product.domain

$coverageTool = "C:\Program Files (x86)\JetBrains\dotCover\dotCover.exe" # adjust to your dotCover install location
$testRunner = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\MSTest.exe"
$testContainers = "D:\org\branch\namespace\domain\project.tests\bin\Output\Org.Namespace.Domain.Project.Tests.dll" # list every test assembly to be covered
foreach($test in $testContainers)
{
   $testAssembly = Get-Item $test
   $namespace = $testAssembly.BaseName -replace "\.Tests$", ""
   $testName = $testAssembly.Name
   $testDirectory = Split-Path $test -Parent
   $testReport = "C:\temp\$testName.trx"
   & $coverageTool cover /TargetExecutable="$testRunner" /Filters="-:$namespace.Tests;+:class=$namespace*" /TargetArguments="/testContainer:$test /resultsFile:$testReport" /Output="C:\temp\$testName.dcvr" /LogFile="C:\DotCover.log.txt"
}


Report Merging

The next step merges all the individual coverage reports generated by the unit-test runs. This gives us our test coverage for the domain: com.product.domain.
$testReports = $testContainers | % {
   $name = Split-Path $_ -Leaf
   return ("c:\temp\{0}.dcvr" -f $name)
}

$testReportArgument = [String]::Join(";", $testReports)
& $coverageTool merge /Source="$testReportArgument" /Output="C:\temp\mergedSnapshots.dcvr"

Report Generation

The unified domain coverage snapshot can now be transformed into useful reports.
HTML Report
Handy for a quick review of coverage issues. We archive these reports in the build artefacts to provide quick lookups; the reports persist until removed by the TFS retention policies.

& $coverageTool report /Source="C:\temp\mergedSnapshots.dcvr" /Output="C:\temp\mergedReport.html" /ReportType="HTML"

XML Report

Ideal format for the next stage where I extract the coverage metrics to complete the code quality assessment in the build pipeline.

& $coverageTool report /Source="C:\temp\mergedSnapshots.dcvr" /Output="C:\temp\mergedReport.xml" /ReportType="XML"

Summary Extraction

The XML report provides a lot of information, but for the purposes of "Pass or Fail", I just need the combined coverage percentage. If it falls below our agreed threshold, then the build must fail.

[xml]$coverageAnalysis = Get-Content "C:\temp\mergedReport.xml"
$blocksCovered = $coverageAnalysis.Root.CoveredStatements
$totalBlocks = $coverageAnalysis.Root.TotalStatements
$totalCoverage = $coverageAnalysis.Root.CoveragePercent
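The pass-or-fail gate itself then boils down to a comparison and an exit code. A minimal sketch; the threshold value of 80 is illustrative, not our real agreed figure:

```powershell
# Fail the build when coverage drops below the agreed threshold.
# 80 is an illustrative figure; substitute your own agreed value.
$threshold = 80
if ([decimal]$totalCoverage -lt $threshold) {
    Write-Error "Coverage of $totalCoverage% is below the agreed threshold of $threshold%."
    exit 1  # a non-zero exit code fails the TFS build step
}
```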




Thursday, 10 July 2014

Enriching our build tweets

We have several Team Foundation projects now running concurrently, and switching between these projects and workspaces in Visual Studio 2013 can be a time-consuming and frustrating chore.

For ease of use, I prefer to aggregate my build messages in a single location. In the past I've used SonarQube's build management to aggregate information about builds in a single repository, and I probably will again at some point in the future. But at the moment we're using a hidden Twitter account into which we push all status messages.



The tweet gives us a quick overview of the build outcome.

We've added a Bit.ly link to each tweet that references an internal HTTP server, giving us quick access to the build transcript generated by TFS.

How we use Powershell to talk to Twitter is covered in an older post; adding a Bit.ly link is demonstrated below:

$username = "-----------"
$apiKey = "-------------------------------" # Legacy API Key

Function Get-ShortURL {
    Param($longURL)
    # Escape the long URL so its own query-string characters survive the trip
    $url = "http://api.bit.ly/shorten?version=2.0.1&format=xml&longUrl=$([uri]::EscapeDataString($longURL))&login=$username&apiKey=$apiKey"
    $request = [Net.WebRequest]::Create($url)
    $responseStream = New-Object System.IO.StreamReader($request.GetResponse().GetResponseStream())
    $response = $responseStream.ReadToEnd()
    $responseStream.Close()

    $result = [xml]$response
    Write-Output $result.bitly.results.nodeKeyVal.shortUrl
}


Get-ShortURL "http://server.local/Build26098.txt" 


Finding satellite and indirect assembly references during the build

Over the past few years, I've posted on several occasions about missing implicit references.

The solutions I came up with for each of these posts were simply managing the symptoms of a hidden problem. But, we couldn't quite put our finger on the real underlying problem. 

After we switched over to NuGet packages to improve our feedback cycles, the problem presented itself more frequently. Everything came to a head when a key facet of our framework assembly was unable to function on application start-up. At this point, we had to find out what the real problem was and we had a new symptom to work with. 

It seemed that core functionality supplied by satellite (indirectly referenced) assemblies was failing because, during the build, MSBuild could not recognise the need for, or locate, the required satellite assemblies, and so didn't copy them into the output folder.

After scrabbling around on the internet for a good half a day, we found this genius MSBuild custom target, which walks the project-reference graph during the AfterBuild stage to gather any and all indirect references.

<Target Name="AfterBuild">
    <!-- Here's the call to the custom task to get the list of dependencies -->
    <ScanIndirectDependencies StartFolder="$(MSBuildProjectDirectory)" StartProjectReferences="@(ProjectReference)" Configuration="$(Configuration)">
        <Output TaskParameter="IndirectDependencies" ItemName="IndirectDependenciesToCopy" />
    </ScanIndirectDependencies>

    <!-- Only copy the file in if we won't stomp something already there -->
    <Copy SourceFiles="%(IndirectDependenciesToCopy.FullPath)" DestinationFolder="$(OutputPath)" Condition="!Exists('$(OutputPath)\%(IndirectDependenciesToCopy.Filename)%(IndirectDependenciesToCopy.Extension)')" />
</Target>


<!-- THE CUSTOM TASK! -->
<UsingTask TaskName="ScanIndirectDependencies" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v$(MSBuildToolsVersion).dll">
    <ParameterGroup>
        <StartFolder Required="true" />
        <StartProjectReferences ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true" />
        <Configuration Required="true" />
        <IndirectDependencies ParameterType="Microsoft.Build.Framework.ITaskItem[]" Output="true" />
    </ParameterGroup>
    <Task>
        <Reference Include="System.Xml" />
        <Using Namespace="Microsoft.Build.Framework" />
        <Using Namespace="Microsoft.Build.Utilities" />
        <Using Namespace="System" />
        <Using Namespace="System.Collections.Generic" />
        <Using Namespace="System.IO" />
        <Using Namespace="System.Linq" />
        <Using Namespace="System.Xml" />
        <Code Type="Fragment" Language="cs">
          <![CDATA[
var projectReferences = new List<string>();
var toScan = new List<string>(StartProjectReferences.Select(p => Path.GetFullPath(Path.Combine(StartFolder, p.ItemSpec))));
var indirectDependencies = new List<string>();

bool rescan;
do{
  rescan = false;
  foreach(var projectReference in toScan.ToArray())
  {
    if(projectReferences.Contains(projectReference))
    {
      toScan.Remove(projectReference);
      continue;
    }

    Log.LogMessage(MessageImportance.Low, "Scanning project reference for other project references: {0}", projectReference);

    var doc = new XmlDocument();
    doc.Load(projectReference);
    var nsmgr = new XmlNamespaceManager(doc.NameTable);
    nsmgr.AddNamespace("msb", "http://schemas.microsoft.com/developer/msbuild/2003");
    var projectDirectory = Path.GetDirectoryName(projectReference);

    // Find all project references we haven't already seen
    var newReferences = doc
          .SelectNodes("/msb:Project/msb:ItemGroup/msb:ProjectReference/@Include", nsmgr)
          .Cast<XmlAttribute>()
          .Select(a => Path.GetFullPath(Path.Combine(projectDirectory, a.Value)));

    if(newReferences.Count() > 0)
    {
      Log.LogMessage(MessageImportance.Low, "Found new referenced projects: {0}", String.Join(", ", newReferences));
    }

    toScan.Remove(projectReference);
    projectReferences.Add(projectReference);

    // Add any new references to the list to scan and mark the flag
    // so we run through the scanning loop again.
    toScan.AddRange(newReferences);
    rescan = true;

    // Include the assembly that the project reference generates.
    var outputLocation = Path.Combine(Path.Combine(projectDirectory, "bin"), Configuration);
    var localAsm = Path.GetFullPath(Path.Combine(outputLocation, doc.SelectSingleNode("/msb:Project/msb:PropertyGroup/msb:AssemblyName", nsmgr).InnerText + ".dll"));
    if(!indirectDependencies.Contains(localAsm) && File.Exists(localAsm))
    {
      Log.LogMessage(MessageImportance.Low, "Added project assembly: {0}", localAsm);
      indirectDependencies.Add(localAsm);
    }

    // Include third-party assemblies referenced by file location.
    var externalReferences = doc
          .SelectNodes("/msb:Project/msb:ItemGroup/msb:Reference/msb:HintPath", nsmgr)
          .Cast<XmlElement>()
          .Select(a => Path.GetFullPath(Path.Combine(projectDirectory, a.InnerText.Trim())))
          .Where(e => !indirectDependencies.Contains(e));

    Log.LogMessage(MessageImportance.Low, "Found new indirect references: {0}", String.Join(", ", externalReferences));
    indirectDependencies.AddRange(externalReferences);
  }
} while(rescan);

// Expand to include pdb and xml.
var xml = indirectDependencies.Select(f => Path.Combine(Path.GetDirectoryName(f), Path.GetFileNameWithoutExtension(f) + ".xml")).Where(f => File.Exists(f)).ToArray();
var pdb = indirectDependencies.Select(f => Path.Combine(Path.GetDirectoryName(f), Path.GetFileNameWithoutExtension(f) + ".pdb")).Where(f => File.Exists(f)).ToArray();
indirectDependencies.AddRange(xml);
indirectDependencies.AddRange(pdb);
Log.LogMessage("Located indirect references:\n{0}", String.Join(Environment.NewLine, indirectDependencies));

// Finally, assign the output parameter.
IndirectDependencies = indirectDependencies.Select(i => new TaskItem(i)).ToArray();
      ]]>
        </Code>
    </Task>
</UsingTask>


With this custom target in place, our output folders became much larger, containing all the indirect references that had previously been lost.

We found another blog post that takes the original idea and expands the satellite-reference discovery to go even deeper, but this wasn't needed in our case.



All of the missing assembly reference issues we'd previously had were gone. At this point, we could remove all the explicit test assembly references and the run-time discovery framework worked entirely as it should.

Monday, 12 May 2014

Code Contracts and conflicting satellite assemblies

Code Contracts : Conflicting assemblies

I've found that the code contracts compiler is a bit on the sensitive side. 

We recently resolved an issue with the code contracts compiler not being able to locate satellite assemblies.

The problem

We recursively build up the search path in MSBuild so that the CC compiler can resolve satellite assemblies, but when two assemblies share the same name, the CC compiler still throws an error about a conflicting assembly.

The solution

At the moment, our solution is to rename or remove the conflicting assembly from the search path. 
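To spot the offenders before the CC compiler does, it helps to group every assembly on the search path by file name and list any name that appears more than once. A quick sketch; the Libraries root below is illustrative, not our real path:

```powershell
# List assemblies that appear under more than one folder on the search path.
# "D:\org\branch\Libraries" is an illustrative root; use your own search path.
Get-ChildItem -Path "D:\org\branch\Libraries" -Recurse -Filter "*.dll" |
    Group-Object -Property Name |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        Write-Output ("Conflict: {0}" -f $_.Name)
        $_.Group | ForEach-Object { Write-Output ("    {0}" -f $_.FullName) }
    }
```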

Code Contracts failing to find satellite references

Code Contracts : Could not resolve type reference

I've found that the code contracts compiler is a bit on the sensitive side. 

We recently refactored our build pipeline to use pre-compiled NuGet modules, whereby we import NuGet packages of pre-compiled segments of our platform rather than rebuilding each time.

Interestingly, this project gave me some insight into the auto-magic performed by MSBuild in helping us get things built. In this instance, it would seem that MSBuild maintains a list of search paths for assemblies, with each output or reference it encounters joining the list as it goes along.

In short, MSBuild was previously filling in the gaps for us, gaps we didn't know existed. It was automatically finding matching assemblies from its dynamic search path. It transpired that we had many 'implicit' references to satellite assemblies that only held true during a full build. These assemblies were not included in our NuGet packages, and so we would see an error such as this:

The problem

These missing libraries were nearly always in our shared libraries folder. 

Target CodeContractRewrite:

"C:\Program Files (x86)\Microsoft\Contracts\Bin\ccrewrite.exe" "@Org.Namespace.Solution.DomainCCRewrite.rsp"

Reading assembly '----.-----.-----.----' from '-----.-----.-----.Repositories.dll' resulted in errors.

Assembly reference not resolved: EntityFramework, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089.

Could not resolve type reference: [EntityFramework]System.Data.Entity.Core.Objects.IObjectSet`1.
Could not resolve type reference: [EntityFramework]System.Data.Entity.Core.Objects.ObjectStateEntry.
Could not resolve type reference: [EntityFramework]System.Data.Entity.Core.Objects.ObjectQuery`1.
ccrewrite : error : Rewrite aborted due to metadata errors. Check output window

The Solution

We created a custom target that furnishes MSBuild with the search paths into our Libraries folder, enabling it to match satellite assemblies that it couldn't find itself.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="BeforeResolveReferences">
    <ItemGroup>
      <LibrariesSearchPathFiles Include="$(MSBuildThisFileDirectory)..\..\..\Libraries\**\*.dll" />
    </ItemGroup>
    <Message Text="Adding Libraries directory to assembly search paths..." />
    
    <RemoveDuplicates Inputs="@(LibrariesSearchPathFiles->'%(RootDir)%(Directory)')">
      <Output TaskParameter="Filtered" ItemName="LibrariesSearchPath" />
    </RemoveDuplicates>

    <CreateProperty Value="$(AssemblySearchPaths);@(LibrariesSearchPath)">
      <Output TaskParameter="Value"
          PropertyName="AssemblySearchPaths" />
    </CreateProperty>
    <CreateProperty Value="$(CodeContractsLibPaths);@(LibrariesSearchPath)">
      <Output TaskParameter="Value"
          PropertyName="CodeContractsLibPaths" />
    </CreateProperty>
  </Target>
</Project>


And, in every .CSPROJ that was throwing up the CodeContractRewrite error, we would add:

<Import Project="$(ProjectDir)..\..\..\ALM\Build\targets\Libraries.targets" />

After this, the Code Contracts compiler had access to a search path that contained all our library items.

Sunday, 11 May 2014

Faster builds using NuGet


It. Takes. Too. Long! :@

Continuous Improvement, like history, is just one thing after another, and whilst we've made great strides along our process-maturity road-map, there's inevitably still room for more improvement.

It takes ages to create a new feature branch.

Our repository is huge: it has a disk footprint of approximately 9GB, so each branch is also 9GB. It doesn't take TFS particularly long to create the new branch; the delay is what happens next:

  • Downloading the repository takes about 10 minutes.
  • A full get has a detrimental impact on the TFS (virtual) server, affecting other developers.

The process of creating new features is entirely scripted in Powershell, so the procedural aspect has been automated; the developers are, however, still inconvenienced by the lengthy wait for completion.

It takes ages to build our new feature (building dependencies)

Once the "Get All" is complete, there is another wait of about 20 minutes for the entire product to be built locally so that all the referenced assemblies are available within the workspace.

The architecture of the product demands that it's built this way. We can't decouple it; this is by design, it's meant to work this way. It's a fine design conceptually, operationally and architecturally. It is, however, a nightmarish web of inter-dependency to build.

It takes ages to publish to a test environment

Once our new feature is branched and built, it needs to be published so that its web services can be consumed and the messaging system can function. This takes a further 20 minutes, as it's no trivial task to deploy over 100 websites and databases.

Let's have a go at making things better?

As a first pass at improving this situation, we attempted to split the team along the traditional divide of skill-set: a separation of the platform and the UI.

The two distinct skill-sets with a clear division of labor presented us with an opportunity to try and split the product.

The UI team's solutions are typically consumers of the platform, so in the build pipeline they are the last to be built and packaged, meaning the UI devs have to wait for the platform to be built first. Using NuGet, we were able to embed these assemblies directly into the UI solutions.

The platform builds published the platform's API to our own hosted NuGet repository, which was then immediately available to the UI team for consumption.
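Publishing from the platform build comes down to two NuGet CLI calls per package. A sketch of the idea; the nuget.exe path, package id, version scheme and feed URL are all hypothetical stand-ins:

```powershell
# Hypothetical paths, package id, version scheme and feed URL - substitute your own.
$nuget = "C:\tools\nuget.exe"
$version = "1.0.$env:BUILD_NUMBER"
# Pack the platform API assemblies described by the nuspec...
& $nuget pack "Org.Platform.Core.nuspec" -Version $version -OutputDirectory "C:\packages"
# ...and push the result to the internally hosted NuGet repository.
& $nuget push "C:\packages\Org.Platform.Core.$version.nupkg" -Source "http://nuget.internal/api/v2/package" -ApiKey $env:NUGET_API_KEY
```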

The results were pretty pleasing:
  1. UI tier builds were reduced to under 1 minute. 
    From a typical wait of 8 minutes.
  2. Platform builds, free of the UI, were reduced by a couple of minutes.
    From a typical wait of 8 minutes to 6.
The division along these lines was working reasonably well for the UI team. But platform developers and so-called "full-stack" developers were not feeling the love. 

Let's try again...

The biggest issue with the division we'd created was the associated changes in working practices.

Each feature now required two branches, one for each team. The NuGet packages provided the link between Platform and UI branches.

This kind of worked but for..
  1. Two lots of "Feature Branch Creation" were needed
  2. "Full-Stack" developers now had two sets of branches to manage, with an un-welcome partition between the two.
  3. Reverse integrations from both API and UI branches required a degree of coordination and communication. 
  4. We had some solutions that had a foot in both camps e.g. back-office admin sites.
  5. Deployments to Production required a rebuild of the UI branch to ensure it was using the new target Platform assemblies.
But we had learnt much from our first attempt...
  • How to perform on-the-fly Nuget substitution using Powershell.
  • How to reverse the NuGet substitutions back into relative file references.
  • How to upgrade and downgrade Nuget versions.
  • How to host our own Nuget Repository.
  • How to programmatically cloak and un-cloak sections of the local workspace.
  • How to programmatically cloak sections of the build definition.
  • Improved dependency discovery.
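The on-the-fly NuGet substitution mentioned above boils down to rewriting each .csproj: swap a <ProjectReference> for a plain <Reference> with a HintPath into the packages folder. A stripped-down sketch; the project path, package version and framework folder are hypothetical:

```powershell
# Swap every <ProjectReference> in a .csproj for a NuGet assembly <Reference>.
# The project path, package version (1.0.23070) and net45 folder are hypothetical.
$projPath = "E:\Projects\Platform\Sprints\FeatureX\Ui.Core\Ui.Core.csproj"
[xml]$proj = Get-Content $projPath
$msb = "http://schemas.microsoft.com/developer/msbuild/2003"
$nsmgr = New-Object System.Xml.XmlNamespaceManager($proj.NameTable)
$nsmgr.AddNamespace("msb", $msb)

# Snapshot the node list with @() before mutating the document.
foreach ($projectRef in @($proj.SelectNodes("//msb:ProjectReference", $nsmgr))) {
    $name = $projectRef.SelectSingleNode("msb:Name", $nsmgr).InnerText
    # Build the replacement <Reference> with a HintPath into the packages folder.
    $reference = $proj.CreateElement("Reference", $msb)
    $reference.SetAttribute("Include", $name)
    $hintPath = $proj.CreateElement("HintPath", $msb)
    $hintPath.InnerText = "..\packages\$name.1.0.23070\lib\net45\$name.dll"
    [void]$reference.AppendChild($hintPath)
    [void]$projectRef.ParentNode.ReplaceChild($reference, $projectRef)
}
$proj.Save($projPath)
```

Reversing the substitution on merge is the same walk in the opposite direction: match each packages HintPath back to its relative project path.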
During discussion, an idea was born of the nascent techniques developed during our first attempt: instead of separating the teams as we had done, let's try to separate the feature instead. Let's take only what we need from the repository, and use NuGet to plug the remaining dependency gaps in the workspace.

Partial builds and Isolating the feature

We've used NuGet to facilitate a situation where developers can select which domains they intend to change, and reference the domains they leave behind via NuGet packages. When I say leave behind, I mean simply absent from the local workspace.

By actively managing the local workspace mappings, we can hide away the domains we're not changing. The end result is that developers of any specialisation, or generalisation, can work on a feature in a workspace tailored to their exact needs.
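The mapping management itself is all tf.exe. A sketch of the selective mapping, assuming Visual Studio 2013's tf.exe; the workspace name, server paths and local paths are illustrative:

```powershell
# Selective workspace mapping with tf.exe; workspace name and paths are illustrative.
$tf = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\tf.exe"

# Map only the domains under change into the local workspace...
& $tf workfold /map "$/Org/Platform/Sprints/FeatureX/Core" "E:\Projects\Platform\Sprints\FeatureX\Core" /workspace:FeatureX

# ...cloak a domain we are leaving behind (it will be referenced via NuGet instead)...
& $tf workfold /cloak "$/Org/Platform/Sprints/FeatureX/Legacy" /workspace:FeatureX

# ...then populate the workspace from the mappings just declared.
& $tf get /recursive
```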

Here's an extract from our internal training documentation for new developers.  This visual aid hopefully illustrates the difference between a partial build using selective workspace mapping,  and a full-output build.




The benefits of partial builds

When we look at the list of benefits, it's embarrassing that we didn't think of this sooner. 
  • A branch with selective mapping can take as little as 10 seconds to download
    • An improvement on 10 minutes
    • Workspaces are refreshed multiple times during the lifetime of a feature, so the savings are compounded over time.
  • Only one feature branch is required
    • Negates the need for the creation and management of multiple branches
    • Coordinated reverse integration is not required
  • Build & feedback times are closer to 30 seconds
    • Feedback is obtained far quicker
    • Quicker builds improves wait-times on the TFS build agents for the entire team
  • The NuGet provides a stable code-base for feature development
    • Features can be based upon solid foundations
    • Features can be based upon previous code releases, current production, or the latest green build. Or, if necessary, another feature branch.
  • Lighter infrastructure footprint
    • Less network traffic and disk usage (<500MB versus 10GB)
    • Faster builds, shorter queues, and a significantly smaller artefacts repository
    • Reduced TFS contention.
    • Deployments of features over the WAN link to Azure are much quicker.
  • Less management
    • Simpler and smaller branches are easier to reverse integrate with the main.
    • The smaller changesets and workspaces vastly improve the performance of TFS and Visual Studio Team Explorer.
    • Check-In Policy is working with smaller changesets and is nimbler.
  • Promiscuous 'tweaks' branch gone
    • We had to use a promiscuous branch for releasing quick fixes between release periods, mostly UI and other visual tweaks. This was a bit of an administrative nuisance, which is now, thankfully, gone.

Revised lifecycle management

In order to make this work, the developers have been provided with a set of Powershell-based scripts that perform all the heavy lifting. Crucially, the scripts follow a Q&A approach to ascertain the developer's requirements before acting.

Starting a new feature

Defining a new feature entails creating an isolated feature branch in the developer's local workspace, with only the solutions of interest present.
    1. Gather requirements from the developer
    2. Create the new branch on TFS
    3. Selective workspace mapping in the local workspace
      1. Perform Get after mappings declared
    4. Create a new CI build definition on the TFS build service
      1. Source cloaking to match developers workspace
    5. Perform NuGet substitution on absent solutions
      1. Download the required packages from NuGet
      2. Replace references to absent solutions with NuGet
      3. Manage the workspaces NuGet configurations
At the end of this process, which takes about 30s to complete, the developers have a new branch within their workspace containing only the domains requested, with all other domains swapped for NuGet references from the selected 'green build'.

New Feature >  Confirm your choices

You would like to create a new feature as described?:
 Create a new branch called 'FeatureX'
 Reference all excluded domains with packages from build 23070
 The TFS path '$/Platform/Sprints/FeatureX' is mapped locally as 'E:\Projects\Platform\Sprints\FeatureX'

 You have chosen to add the following into your local workspace:

Core
Core.ACL
Core.Content
Ui.Content
Ui.Core
Ui.Models
Ui.Sites.Primary

Mapping $/Org/Platform/Sprints/Core
Mapping $/Org/Platform/Sprints/Core/ACL
Mapping $/Org/Platform/Sprints/Core.Content
Mapping $/Org/Platform/Sprints/UI/Content
Mapping $/Org/Platform/Sprints/UI/Core
Mapping $/Org/Platform/Sprints/UI/Models
Mapping $/Org/Platform/Sprints/UI/Sites/Primary

Reference all excluded domains with packages from build 23070

Are you sure?
[Y] Yes  [N] No  [?] Help (default is "N"): y

Build Platform

Performed more frequently than any other action, builds run against the solutions isolated in the feature branch. Dependency discovery is still important, as there is still a build order to be observed for the solutions that are present.
    1. Gather requirements from the developer
    2. Offer opportunity to update NuGet references to latest 'green build'
      1. Dependency discovery
    3. Start build

Get Feature

Occasionally, another developer may wish to collaborate on a feature already under development. In this instance, the new developer needs to recreate the same workspace configuration within their own environment. The TFS build definition has its own set of source mappings that recreate the same level of isolation on the CI server, so it can be queried to extrapolate the solutions of interest.
  1. Gather requirements from the developer
  2. Inspect the identified TFS build definition
    1. Identify which domains the feature is using
    2. Recreate the matching TFS mappings into the developers own local workspace
    3. Perform a Get
  3. Start build

Reverse Integration

Reverse integration is very straightforward. The heavily cloaked feature branch contains only a small isolated group of solutions, so it takes TFS far less time to identify changes during the merge process.

Once the feature branch has been successfully merged back into Main, the isolated solutions are reunited with the rest of the platform, without any missing dependencies.
  1. Perform TFS merge from feature to main
  2. Switch back all the NuGet references for actual relative path references
  3. Remove all redundant "packages.config"
  4. Remove all redundant "packages.config" references in "repository.config"
  5. Dependency discovery
  6. Start build
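Steps 3 and 4 are simple sweeps over the merged branch. A sketch of step 3; the branch root is illustrative:

```powershell
# Remove every now-redundant packages.config from the merged branch.
# The branch root path is illustrative.
$branchRoot = "E:\Projects\Platform\Main"
Get-ChildItem -Path $branchRoot -Recurse -Filter "packages.config" |
    Remove-Item -Force
```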

Hot Fixes

During the creation of a new branch, the Q&A will work out whether you're taking the 'Head' of the repository, or going back in time to a previous build. 

If you select an older build label, you are presumed to be performing a hot-fix on the current production version. The Q&A script will present a selection of 'QA approved' builds from TFS which will serve as the base of the hot-fix branch. 

The NuGet packages used to substitute absent domains are pinned to the same platform version as the hot-fix branch source. This ensures complete compatibility between the domains affected by the hot-fix and the current production version.

We typically release weekly, and so hot-fix branches are used frequently by the UI developers for getting quick updates out to production ahead of the next release.

The results

Full platform (Before)
This is how the world looked before we undertook the isolated feature work.

  • Disk footprint: 9GB
  • Time to first line of code: 50m



Partial build (isolated feature)
This is a bit finger-in-the-air, as the code affected by a feature varies greatly depending on the scale of the feature.

  • Disk footprint: <500MB
  • Time to first line of code:  5m


Headline benefits
The things that really matter to the people I support:

  • We didn't have to buy another build agent after all
  • Work on a feature can start 45 minutes sooner
  • Our CI builds are over 6 minutes quicker
  • Build agent queue lengths are practically gone.
  • Publishing is over 20 minutes quicker
  • You can work on any part of the platform you need to
  • Back to a single manageable branch! 
Development time
Took approximately 4 weeks.

*Typically, up to 3 solutions are affected by a feature.

Friday, 3 January 2014

Self hosting WCF applications

When self-hosting a WCF application outside of IIS, the account running the service needs permission to listen on the HTTP namespace. The following netsh command reserves the URL for all users:

netsh http add urlacl url=http://+:8888/WebsiteFolder user=Everyone
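With that URL reservation in place, a non-admin process can host at the reserved address. A minimal self-hosted service, sketched in PowerShell; the contract, implementation and base address are illustrative:

```powershell
# Minimal self-hosted WCF service; contract, implementation and address are illustrative.
Add-Type -AssemblyName System.ServiceModel
Add-Type -ReferencedAssemblies System.ServiceModel -TypeDefinition @"
using System.ServiceModel;
[ServiceContract]
public interface IPing { [OperationContract] string Ping(); }
public class PingService : IPing { public string Ping() { return "pong"; } }
"@

$serviceHost = New-Object System.ServiceModel.ServiceHost([PingService], [Uri]"http://localhost:8888/WebsiteFolder")
[void]$serviceHost.AddServiceEndpoint([IPing], (New-Object System.ServiceModel.BasicHttpBinding), "")
$serviceHost.Open()   # throws AddressAccessDeniedException without the urlacl reservation
Read-Host "Service listening; press Enter to stop"
$serviceHost.Close()
```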