Tuesday, 29 October 2013

Using Twitter as a build status board

Twitter is an ideal medium for a build status board: everyone has an account, team members can choose how and when they'd like to be informed, and it works on a multitude of devices for the BYOD crowd.

You need a Twitter account, plus a Twitter developer account, so that you can define an application and obtain the necessary keys and tokens (a consumer key and secret, and an access token and secret).

But once you have this information, you're good to send your first tweets.

[Image: TweetDeck employed as an 'activity board']
Function Set-TwitterStatus
{
    param (
        [string]$tweet)

    # Load the assemblies needed for HMAC-SHA1 signing and the HTTP request
    [Reflection.Assembly]::LoadWithPartialName("System.Security")  | Out-Null
    [Reflection.Assembly]::LoadWithPartialName("System.Net")  | Out-Null

    # Keys and tokens from your Twitter application (deliberately left blank here)
    $status = [System.Uri]::EscapeDataString($tweet);
    $oauth_consumer_key = "";
    $oauth_consumer_secret = "";
    $oauth_token = "";
    $oauth_token_secret = "";

    # A one-off nonce and a Unix-epoch timestamp, both required by OAuth 1.0a
    $oauth_nonce = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes([System.DateTime]::Now.Ticks.ToString()));

    $ts = [System.DateTime]::UtcNow - [System.DateTime]::ParseExact("01/01/1970", "dd/MM/yyyy", $null).ToUniversalTime();
    $oauth_timestamp = [System.Convert]::ToInt64($ts.TotalSeconds).ToString();

    # Build the OAuth signature base string (parameters in lexicographic order)
    $signature = "POST&";
    $signature += [System.Uri]::EscapeDataString("https://api.twitter.com/1.1/statuses/update.json") + "&";
    $signature += [System.Uri]::EscapeDataString("oauth_consumer_key=" + $oauth_consumer_key + "&");
    $signature += [System.Uri]::EscapeDataString("oauth_nonce=" + $oauth_nonce + "&");
    $signature += [System.Uri]::EscapeDataString("oauth_signature_method=HMAC-SHA1&");
    $signature += [System.Uri]::EscapeDataString("oauth_timestamp=" + $oauth_timestamp + "&");
    $signature += [System.Uri]::EscapeDataString("oauth_token=" + $oauth_token + "&");
    $signature += [System.Uri]::EscapeDataString("oauth_version=1.0&");
    $signature += [System.Uri]::EscapeDataString("status=" + $status);

    # Sign the base string with the consumer secret and token secret
    $signature_key = [System.Uri]::EscapeDataString($oauth_consumer_secret) + "&" + [System.Uri]::EscapeDataString($oauth_token_secret);

    $hmacsha1 = new-object System.Security.Cryptography.HMACSHA1;
    $hmacsha1.Key = [System.Text.Encoding]::ASCII.GetBytes($signature_key);
    $oauth_signature = [System.Convert]::ToBase64String($hmacsha1.ComputeHash([System.Text.Encoding]::ASCII.GetBytes($signature)));

    # Assemble the OAuth Authorization header
    $oauth_authorization = 'OAuth ';
    $oauth_authorization += 'oauth_consumer_key="' + [System.Uri]::EscapeDataString($oauth_consumer_key) + '",';
    $oauth_authorization += 'oauth_nonce="' + [System.Uri]::EscapeDataString($oauth_nonce) + '",';
    $oauth_authorization += 'oauth_signature="' + [System.Uri]::EscapeDataString($oauth_signature) + '",';
    $oauth_authorization += 'oauth_signature_method="HMAC-SHA1",';
    $oauth_authorization += 'oauth_timestamp="' + [System.Uri]::EscapeDataString($oauth_timestamp) + '",';
    $oauth_authorization += 'oauth_token="' + [System.Uri]::EscapeDataString($oauth_token) + '",';
    $oauth_authorization += 'oauth_version="1.0"';

    # POST the status update to the Twitter REST API
    $post_body = [System.Text.Encoding]::ASCII.GetBytes("status=" + $status);
    [System.Net.HttpWebRequest] $request = [System.Net.WebRequest]::Create("https://api.twitter.com/1.1/statuses/update.json");
    $request.Method = "POST";
    $request.Headers.Add("Authorization", $oauth_authorization);
    $request.ContentType = "application/x-www-form-urlencoded";
    $body = $request.GetRequestStream();
    $body.Write($post_body, 0, $post_body.Length);
    $body.Flush();
    $body.Close();
    $response = $request.GetResponse();
    $response.Close();
}

Then, you're good to create tweets to suit your needs.

Set-TwitterStatus "Build 1000 has completed with 0 errors"
Set-TwitterStatus "Build 1000 was deploy to OAT in 1m:55s"

Monday, 28 October 2013

Switching from MSTest to VSTest.Console

Having recently leap-frogged from Visual Studio 2010 to 2013, we needed to update our ALM tooling to match.

Note to self: a lot of this might be redundant now that I've learnt you can implement your own custom logger for VSTest. Revisit this later.

Test automation and code coverage

In the 2010 era, we used MSTest and NCover to perform our automated tests and coverage analysis respectively.
Now, in the 2013 era, we've made the jump to VSTest to provide test automation and coverage analysis.

VSTest command line

No alarms at this point; it all looks quite similar and fairly safe.

$VSTestexe = "c:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe"

& $VSTestexe "$($test.container)" "/logger:trx" "/Enablecodecoverage" "/InIsolation" | Out-file "$testRunReportFile"

But you will now discover a number of irritations with using VSTest. So, in ascending order of annoyance:

  1. You cannot specify the output file name for the test results
  2. You cannot specify the output file name for the coverage report
  3. The coverage report is binary
  4. The results are still different from those generated within the Visual Studio IDE!

Getting at the test results

Unlike MSTest, VSTest will not let you specify the name or location of the test results output file. A bit of a nuisance, but not insurmountable.

The strategy I've adopted is to capture the console output of VSTest to a file, then use a regular expression to extract the file paths for the test results and coverage files.

$testResultsPattern = [regex]"\s?Results File:\s(.*\.trx)"

$testRunReport = Get-Content "$testRunReportFile" | Out-String

$generatedResultsFile = [Management.Automation.WildcardPattern]::Escape($testResultsPattern.Matches($testRunReport)[0].Groups[1].Value)

You may be wondering why the Escape method is used. VSTest can produce file names that contain [ and ] characters, which PowerShell interprets as wildcard operators, so we need to escape them.
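
To illustrate (a contrived file name, purely for demonstration): anything that treats its argument as a wildcard pattern will fail to find a path containing brackets until it is escaped, or until you switch to a -LiteralPath parameter.

# A hypothetical results file whose name contains wildcard characters
$resultsFile = "C:\TestResults\MyTests[1].trx"

Test-Path $resultsFile                # $false even when the file exists - [1] is treated as a character class
Test-Path -LiteralPath $resultsFile   # $true - the path is taken literally

# Escaping achieves the same thing for parameters that only accept a pattern
$escaped = [Management.Automation.WildcardPattern]::Escape($resultsFile)
Test-Path $escaped                    # $true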

So, with the filename of the .TRX results file, we're now free to open the XML and get working.

[xml]$testOutcomes = get-content $generatedResultsFile
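
As a small example of what you can pull out of the .TRX file (element names as I understand the TRX schema, so verify them against your own results file), the ResultSummary node carries the overall outcome and the pass/fail counters:

# Overall outcome and counters from the TRX; dot-navigation ignores the XML namespace
$summary = $testOutcomes.TestRun.ResultSummary

$outcome = $summary.outcome
$passed  = [int]$summary.Counters.passed
$failed  = [int]$summary.Counters.failed
$total   = [int]$summary.Counters.total

"{0}: {1} of {2} tests passed, {3} failed" -f $outcome, $passed, $total, $failed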

Enabling code coverage

This is a simple matter of adding /Enablecodecoverage to the command-line parameters, as in the VSTest invocation above.

Converting the binary coverage file to XML

Similarly, unlike NCover, VSTest will not let you specify the name or location of the coverage output file, so the same approach applies: capture the console output to a file, then extract the coverage file path with a regular expression.

$coverageResultsPattern =  [regex]"\s{2,}(.*\.coverage)"

$testRunReport = Get-Content "$testRunReportFile" | Out-String

$generatedCoverageFile = [Management.Automation.WildcardPattern]::Escape($coverageResultsPattern.Matches($testRunReport)[0].Groups[1].Value)

As before, the Escape call guards against the [ and ] characters that VSTest can put in the generated file name.

So, on to the next problem: the generated coverage file is in a binary format. Taking advice from elsewhere on the internet, it's possible to convert it into XML with a bit of C#.

A simple console application that converts the coverage binary to XML:

using Microsoft.VisualStudio.Coverage.Analysis;

namespace CoverageConverter
{
    class Program
    {
        // args[0] = path to the binary .coverage file, args[1] = path for the XML output
        static void Main(string[] args)
        {
            // The coverage file's directory is also searched for the instrumented binaries and symbols
            string path = System.IO.Path.GetDirectoryName(args[0]);

            using (CoverageInfo info = CoverageInfo.CreateFromFile(args[0],
                new string[] { path },
                new string[] { }))
            {
                CoverageDS data = info.BuildDataSet();

                data.WriteXml(args[1]);
            }
        }
    }
}

And once more, from the familiar realm of PowerShell, we can invoke our new coverage converter console app as follows:

$coverageConverter = Join-Path $($action.workingDir) "..\Tools\CoverageConverter\coverageconverter.exe"

 & $coverageConverter "TestRunResults.coverage" "TestRunResults.xml" | Out-Null

And we've now got an XML file we can use more easily with other tools.
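
As a quick sanity check on the converted file (a sketch only; the Module / LinesCovered / LinesNotCovered element names come from the CoverageDS dataset the converter writes, so check them against your own output), you can list per-module line coverage straight from the XML:

[xml]$coverageXml = Get-Content "TestRunResults.xml"

# Assumed element names - verify against the XML produced by your converter
foreach ($module in $coverageXml.SelectNodes("//Module"))
{
    $covered   = [int]$module.LinesCovered
    $uncovered = [int]$module.LinesNotCovered
    if (($covered + $uncovered) -gt 0)
    {
        "{0}: {1:P1} line coverage" -f $module.ModuleName, ($covered / ($covered + $uncovered))
    }
}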

Parsing the XML file with ReportGenerator

This is the best bit.

An open source project called ReportGenerator is very adept at converting coverage output into meaningful reports and statistics.

So, in my case, with a bit of PowerShell:

# Path to the executable
$reportGenerator = Join-Path $($action.workingDir) "..\Tools\CoverageConverter\reportgenerator.exe"

# Path to the XML file generated by the conversion console app
$xmlReportFile = Join-Path $action.data.output "$($action.data.report).xml" 

# Semicolon-separated list of the converted XML reports (one per coverage file)
$reports = [system.String]::Join(";", ($coverageBinaries | % {"$_.xml"} ) )

# Filter expression passed to ReportGenerator's -Filters switch
$filter = "regex"

# Create an XML summary
& $reportGenerator -reports:$reports -targetdir:$action.data.output -reporttypes:XmlSummary -outputName:$action.data.report -Filters:$filter | Out-Null

# Create an HTML summary
& $reportGenerator -reports:$reports -targetdir:$action.data.output -reporttypes:HtmlSummary -outputName:$action.data.report -Filters:$filter | Out-Null

# Open the XML summary for further analysis and usage.
[xml]$coverageAnalysis = Get-Content $xmlReportFile
$linesCovered   = [int]$coverageAnalysis.CoverageReport.Summary.Coveredlines
$linesUncovered = [int]$coverageAnalysis.CoverageReport.Summary.Uncoveredlines
$totalLines     = [int]$coverageAnalysis.CoverageReport.Summary.Coverablelines

Failing a build

In the wider perspective of the build pipeline, now that I have the aggregated coverage information from multiple runs of VSTest, I'm free to decide whether to accept or fail the build.

$totalCoverage = ($linesCovered/$totalLines)

if (($totalCoverage*100) -lt $action.data.coverage.minimumAllowed)
{
   $primaryResult.state = 1;
   $primaryResult.message.content = ("Coverage achieved: {1:P2}. Failed to achieve threshold of {0}%" -f $action.data.coverage.minimumAllowed, $totalCoverage)
}