.NET, Sitecore and setup development

This blog is used as a memory dump of random thoughts and interesting facts about different things in the world of IT.

VSTS and TeamCity Commit Status Publisher

Some time ago the VSTS team added a feature called Pull Request Status Extensibility. It opened the door for external services to post custom statuses to pull requests in Git repositories hosted in VSTS. Once a status is posted, a branch policy can be built on top of it, and this fact makes it a powerful feature.

According to the VSTS Feature Timeline, Pull Request Status Extensibility is expected to arrive on-premises starting with TFS 2018 RC1.

Fortunately, the Commit Status Publisher in TeamCity has just gained the option to send pull request statuses, starting with the most recent builds of version 2017.2.

At the moment of writing this post, version 2017.2 is still in EAP; I'll use 2017.2 EAP4, the first build the feature arrived with.

These two pieces assemble into a nice picture where you can host your project in VSTS while keeping the build part entirely in TeamCity. In this post, I'll guide you through the steps required to configure this beautiful setup.

TeamCity: basic setup of the build project

To begin with, we'll add a connection to VSTS in TeamCity. It is not required, but it helps a lot with the further configuration of the VCS root and build features. Navigate to Administration > <Root Project> > Connections and click the "Add Connection" button:

Now, let’s create a new build project. Thanks to the connection configured prior to this step, the VCS root configuration is as easy as clicking a Visual Studio icon:

Choose the repository to target, and TeamCity will form the proper clone URL. Note that the Branch Specification field is set to watch pull requests too.

For the sake of this demo the build project itself is quite simple: it contains just one build configuration, which in turn consists of a single PowerShell build step faking the real build process with a several-second sleep. There's also a VCS trigger to run the build on changes in the default branch (+:<default>) as well as on pull request merges (+:pull/*/merge).
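The fake build step is nothing fancy; a sketch of what such a PowerShell step could contain (the messages and the sleep duration are arbitrary choices for the demo):

```powershell
# Pretend to do some real work for a while
Write-Host "Restoring, compiling, testing..."
Start-Sleep -Seconds 30
Write-Host "Build finished."
```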

Finally, we should configure the Commit Status Publisher, which does all the magic. Switch to Build Features in the left pane and click the "Add Build Feature" button:

Note the checkbox hidden in the Advanced Options view. It should be turned on in order to enable pull request status publishing.

Ideally, you should generate another personal access token in VSTS with only the Code (status) and Code (read) scopes specified. However, being lazy, I've just clicked the magic wand icon, and TeamCity pulled the all-scopes access token from the connection.

VSTS: creating a pull request with status from TeamCity

Now that we're done with the TeamCity configuration, let's go ahead and create a pull request in our VSTS Git repository. When TeamCity detects the change, it starts building the pull request. At the same time, the pull request view in VSTS displays the appropriate status:

Once the build has completed, the status is refreshed:

If you click the link, it navigates to the completed build page in TeamCity:

VSTS: Make branch policy out of the TeamCity build status

Once an external service has published its status to a pull request, it is possible to configure that status to serve as a branch policy for this and all other pull requests in the repository. Let's do this now.

Navigate to the branch policies of the master branch and click "Add Service" in the "Require approval from external services" section:

Choose the target service from the dropdown (its name combines the TeamCity build project and configuration) and modify the other options according to your needs. Note that it is possible to configure the service so that it behaves like a normal branch policy. For example, the status can be required and can expire when the source branch gets an update:

Finally, click Save and push some other change to the existing pull request. As soon as the pull request is updated, the Status section disappears and the new policy is displayed. It stays in waiting mode until the TeamCity build starts:

Once the build is started, the policy status changes to Pending:

Finally, when the build is done, it is also reflected on the custom policy status:

Similar to the pull request status behavior, it is possible to click the link and navigate to the build view in TeamCity.

TeamCity: build normal branches and post the status back to VSTS

When we merge the pull request, a build of the master branch is triggered in TeamCity. If you switch to the Branches view in VSTS, you can see an In Progress icon in the Build column for the master branch:

Once the build is completed, the icon changes to the appropriate state (Success in our case):


In this article, we've quickly run through the steps required to configure close integration between a VSTS Git repository and a TeamCity build project. Note that I haven't written a single line of code for this to happen. This setup might be useful for projects that have an extensive build configuration in TeamCity, but would like to benefit from the fantastic pull request user experience in VSTS.


Today I faced a problem installing the Basic Authentication feature into the Web Server role on Windows Server 2012 R2. The wizard kept throwing various errors, including a scary OutOfMemoryException. A quick googling found a suggestion to run netsh http show iplisten and add (aka Home Sweet Home) to the list if it's not there. I gave it a try without giving it a thought first.

The initial problem was not solved – the wizard kept failing to add the feature – and I finally resolved it with the mighty PowerShell:

Import-Module ServerManager
Add-WindowsFeature Web-Basic-Auth

Later on I had to browse a website hosted on that server, and I suddenly saw the "This webpage is not available" message. Hmm… First off, I verified that the website worked locally – and it did. This gave me another hint, and I checked whether the bindings were set up correctly. And they were! Finally, I started to think the Basic Authentication feature was to blame – yeah, I know, that was a stupid assumption, but hey, stupid assumptions feel very comfortable for the brain when it faces magic…

Anyway, fortunately I recalled that quick dumb action I had done with netsh, and the magic immediately turned into dust, revealing someone's ignorance… It turns out that if iplisten doesn't list anything, it means "listen to everything, any IP address". When you add something there, it starts listening to that IP address only.

Thus, it was all resolved by deleting from that list with netsh http delete iplisten ipaddress=
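For future reference, here is the whole netsh conversation in one place (Windows commands, run from an elevated prompt; judging by the "Home Sweet Home" remark, the address involved was the loopback one):

```
:: show the current listen list (an empty list means "listen on all addresses")
netsh http show iplisten

:: the dubious addition that broke remote browsing
netsh http add iplisten ipaddress=

:: the cure: remove it so the server listens on all addresses again
netsh http delete iplisten ipaddress=
```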

Want some quick conclusion? Think first, then act!!!

Written with StackEdit.

Build Queue Starvation in CC.NET

Recently I've come across an interesting behavior in CruiseControl.NET with regard to build queues and priorities.

If there are many projects on one server, the server is not quite powerful, more than one build queue is configured, and (that's the last one) these build queues have different priorities, you might end up in a situation where CC.NET checks the same set of projects for modifications over and over again and never starts an actual build. If you add the projects from that server to CCTray, you can observe that the number of projects queued for build reaches a certain number and never decreases.

This phenomenon is called "build queue starvation". It was described and explained by Damir Arh in his blog.

Let me summarize the main idea.

When one build queue has a higher priority than another, CC.NET favors the projects from the first queue when scheduling modification checks. Now imagine that the trigger interval of the projects in the higher-priority queue is quite small and the number of such projects is big enough. This leads to a situation where the first project in the high-priority queue is scheduled for its second round of building before the last project in that queue has been built for the first time.

As a result, the lower-priority queue is "starving" – none of its projects ever gets a chance to be built. The fix suggested in the link above suited my needs – I simply increased the trigger interval.
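For illustration, the fix boils down to a change like this in ccnet.config – a minimal sketch, assuming the projects use a plain intervalTrigger (the project name, queue name and the 300-second value are made up for the example):

```xml
<project name="NoisyProject" queue="HighPriorityQueue" queuePriority="1">
  <triggers>
    <!-- was a much smaller value; a longer interval leaves gaps
         for the lower-priority queue to get its projects built -->
    <intervalTrigger seconds="300" />
  </triggers>
</project>
```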

I should say it's not easy to google this if you're not familiar with the term "build queue starvation". Besides, CC.NET doesn't feel bad in this situation, and hence doesn't help with any warnings – it just does its job, iterating the queue and following the instructions.


Setting Up an Existing Blog on Octopress

Ok, it took me some time and effort to set up the environment for blogging. Consider this post a quick instruction to myself for the next time I have to do this.

So, there's an existing blog created with Octopress, hosted on GitHub. The task is to set up a brand new machine to enable a smooth blogging experience.

Note: just in case you have to create a blog from scratch, follow the official Octopress docs; they're quite clear.

First of all, you should install Ruby. The Octopress docs recommend using either rbenv or RVM for this. Both words sound scary, so don't hesitate to take the easy path and download an installer from here. On the last page of the installation wizard, choose to add the Ruby binaries to the PATH:

When the installer completes, check the installed version:

ruby --version

Then, clone the repo with the blog from GitHub. Instead of calling rake setup_github_pages as suggested by the Octopress docs, follow these steps found here. Let's assume we've cloned into a blog folder:

git clone git@github.com:username/username.github.com.git blog
cd blog
git checkout source
mkdir _deploy
cd _deploy
git init
git remote add origin git@github.com:username/username.github.com.git
git pull origin master
cd ..

Now do the following:

gem install bundler
bundle install

This should pull all the dependencies required by the Octopress engine. Here's where I hit the first inconsistency in the docs – one of the dependencies (fast-stemmer) fails to install without the DevKit. Download it and run the installer. The installation process is documented here, but the quickest way is:

  • self-extract the archive
  • cd to that folder
  • run ruby dk.rb init
  • then run ruby dk.rb install

After this, re-run the bundle install command.

Well, at this point you should be able to create new posts with rake new_post[title] command. Generate the resulting HTML with rake generate and preview it with rake preview to make sure it produces what you expect.

An important note about syntax highlighting

Octopress uses Pygments to highlight code. It is a Python thing, so obviously you should install Python for this to work. Choose a 2.x version of Python – the 3.x versions don't work. This is important: you won't be able to generate HTML from Markdown otherwise.

That's it! Hope this will save me some time in the future.

And by the way, this all is written with StackEdit – a highly recommended online markdown editor.

Migrate Attachments From OnTime to TFS

When you move from one bug tracking system to another, the accuracy of the process is very important. A single missing point can make a work item useless, and an attached image is often worth a thousand words. Hence, today's post is about migrating attachments from OnTime to TFS.

NOTE: The samples in this post rely on OnTime SDK, which was replaced by a brand new REST API.

OnTime SDK is a set of web services, and each "area" is usually covered by one or more web services. The operations with attachments are grouped in the /sdk/AttachmentService.asmx web service.

So, the first thing to do is to grab all attachments of the OnTime defect:

var rawAttachments = _attachmentService.GetAttachmentsList(securityToken, AttachmentSourceTypes.Defect, defect.DefectId);

This method returns a DataSet, and you’ll have to enumerate its rows to grab the useful data:

var attachments = rawAttachments.Tables[0].AsEnumerable();
foreach (var attachment in attachments)
{
  // wi is a TFS work item object
  wi.Attachments.Add(GetAttachment(attachment));
}

Now, let’s take a look at the GetAttachment method, which actually does the job. It accepts the DataRow, and returns the TFS Attachment object:

private Attachment GetAttachment(DataRow attachmentRow)
{
  var onTimeAttachment = _attachmentService.GetByAttachmentId(securityToken, (int)attachmentRow["AttachmentId"]);

  var tempFile = Path.Combine(Path.GetTempPath(), onTimeAttachment.FileName);
  if (File.Exists(tempFile))
  {
    File.Delete(tempFile);
  }

  File.WriteAllBytes(tempFile, onTimeAttachment.FileData);

  return new Attachment(tempFile, onTimeAttachment.Description);
}

A couple of things to notice here:

  • you have to call another web method to pull binary data of the attachment
  • OnTime attachment metadata is rather useful and can be moved as is to TFS, for instance, attachment description

Finally, when a new attachment is added to the TFS work item, "increment" the ChangedDate of the work item before saving it. The TFS server often refuses to save work item data if the previous revision has exactly the same date/time stamp. Like this (always works):

wi[CoreField.ChangedDate] = wi.ChangedDate.AddSeconds(5);

Hope it’s useful. Good luck!

NAnt Task Behaves Differently in 0.92 and Prior Versions

If you need to copy a folder together with all its contents to another folder in NAnt, you would typically write something like this:
<copy todir="${target}">
  <fileset basedir="${source}" />
</copy>
It turns out this code works correctly in NAnt 0.92 Alpha and above. The output is expected:
[copy] Copying 1 directory to '...'.
However, the same code doesn't work in prior versions of NAnt, for instance, 0.91. The output is as follows (visible only in -debug+ mode):
[copy] Copying 0 files to '...'.
Obviously, the issue was fixed in 0.92, so the best recommendation would be to upgrade the NAnt toolkit. However, if this is not an option for some reason, the following code seems to work correctly for any version:
<copy todir="${target}">
  <fileset basedir="${source}">
    <include name="**/*" />
  </fileset>
</copy>
Hope this saves you some time.

Possible Source of the Signtool ‘Bad Format’ 0x800700C1 Problem

Today I faced a weird problem. The operation to sign an EXE file (actually, an installation package) with a valid certificate failed with the following error:
[exec] SignTool Error: SignedCode::Sign returned error: 0x800700C1
[exec] Either the file being signed or one of the DLL specified by /j switch is not a valid Win32 application.
[exec] SignTool Error: An error occurred while attempting to sign: D:\output\setup.exe
This kind of error is usually an indication of a format incompatibility, when the bitness of signtool.exe and the bitness of the EXE in question don't match. However, this was not the case.
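For context, the signing operation was an ordinary signtool call of this shape (the certificate file, password and timestamp server here are hypothetical):

```
signtool sign /f mycert.pfx /p secret /t http://timestamp.digicert.com D:\output\setup.exe
```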

It turned out that the original EXE file had been generated incorrectly because of a lack of disk space. That's why it was broken and was recognized by signtool as a bad-format file. After a disk cleanup everything worked perfectly, and the EXE file was signed correctly.

Hope this saves someone some time.

A Solution Can Build Fine From Inside Visual Studio, but Fail to Build With msbuild.exe

Today I faced an interesting issue. Although I failed to reproduce it on a fresh new project, I think this info might be useful for others.
I have a solution which was upgraded from targeting .NET Framework 2.0 to .NET Framework 3.5. I got a patch from a fellow developer to apply to one of the projects of that solution. The patch adds new files and modifies existing ones. After applying the patch, the solution builds successfully from inside Visual Studio, but fails to build from the command line with msbuild.exe. The error thrown states that
“The type or namespace name 'Linq' does not exist in the namespace 'System' ”. 
The msbuild version is 3.5:
[exec] Microsoft (R) Build Engine Version 3.5.30729.5420
[exec] [Microsoft .NET Framework, Version 2.0.50727.5456]
[exec] Copyright (C) Microsoft Corporation 2007. All rights reserved.
It turns out this issue has been hit by other people, and even reported to Microsoft. Microsoft suggested using MSBuild.exe 4.0 to build VS 2010 projects. However, they confirmed it is possible to use MSBuild.exe 3.5 – in this case a reference to System.Core must be explicitly added to the csproj file.
If you try to add a reference to System.Core from inside Visual Studio, you'll get an error saying:
"A reference to 'System.Core' could not be added. This component is already automatically referenced by the build system"
So, it seems that when you build a solution from inside Visual Studio, it is capable of automatically loading implicitly referenced assemblies. I suppose MSBuild.exe 4.0 (and even an SP1-patched MSBuild.exe 3.5?) can do this as well. Apparently, this has also turned out to be a known problem – you can't add that reference from the IDE. Open the csproj file in your favorite editor and add this:
<Reference Include="System.Core" />
After this, the project builds fine in both VS and MSBuild.

Default Attribute Values for Custom NAnt Tasks

When you create custom NAnt tasks, you can specify various task parameter characteristics, such as whether an attribute is required, how it validates its value, etc. This is done via custom .NET attributes, for example:
[TaskAttribute("param", Required = true), StringValidator(AllowEmpty = false)]
public string Param { get; set; }
It might be a good idea to be able to specify a default value for a task parameter in a similar way, for instance:
[TaskAttribute("port"), Int32Validator(1000, 65520), DefaultValue(16333)]
public int Port { get; set; }
Let’s examine the way it can be implemented. First of all, let’s define the custom attribute for the default value:
/// <summary>
/// The custom attribute for the task attribute default value
/// </summary>
public class DefaultValueAttribute : Attribute
{
    public DefaultValueAttribute(object value)
    {
        this.Default = value;
    }

    public object Default { get; set; }
}
I suppose the standard .NET DefaultValueAttribute could be used for this purpose as well, but the one above is very simple and good enough for this sample. Note also that in this situation we could benefit from generic custom attributes, which unfortunately are not supported in C#, although they are quite valid for the CLR.

Now that the attribute is defined, let's design the way default values will be applied at runtime. For this purpose we'll have to define a special base class for all custom tasks that want to use the default values technique:
public abstract class DefaultValueAwareTask : Task
{
    protected override void ExecuteTask()
    {
        this.SetDefaultValues();
    }

    protected virtual void SetDefaultValues()
    {
        foreach (var property in GetPropertiesWithCustomAttributes<DefaultValueAttribute>(this.GetType()))
        {
            var attribute = (TaskAttributeAttribute)property.GetCustomAttributes(typeof(TaskAttributeAttribute), false)[0];
            var attributeDefaultValue = (DefaultValueAttribute)property.GetCustomAttributes(typeof(DefaultValueAttribute), false)[0];

            if (attribute.Required)
            {
                throw new BuildException("No reason to allow both to be set", this.Location);
            }

            if (this.XmlNode.Attributes[attribute.Name] == null)
            {
                property.SetValue(this, attributeDefaultValue.Default, null);
            }
        }
    }

    private static IEnumerable<PropertyInfo> GetPropertiesWithCustomAttributes<T>(Type type)
    {
        return type.GetProperties(BindingFlags.DeclaredOnly | BindingFlags.Public | BindingFlags.Instance)
            .Where(property => property.GetCustomAttributes(typeof(T), false).Length > 0);
    }
}
Let's examine what this code actually does. The key method here is SetDefaultValues(). It iterates through the task parameters (the public properties marked with the DefaultValueAttribute attribute) of the class it is defined in and checks whether the value carried by the DefaultValueAttribute should be set as the true value of the task parameter. It is quite simple: if the XmlNode of the NAnt task definition doesn't contain the parameter in question, the developer didn't set it explicitly, and it is necessary to set the default value. Moreover, if a task parameter is marked as Required and has a default value at the same time, this combination is treated as inappropriate and an exception is thrown.

Obviously, when a custom NAnt task derives from the DefaultValueAwareTask, it has to call base.ExecuteTask() at the very start of its ExecuteTask() method implementation for this technique to work.
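To illustrate the usage, here is a sketch of a hypothetical task deriving from DefaultValueAwareTask (the task name and parameter are made up for the example; it needs a reference to NAnt.Core to compile):

```csharp
[TaskName("connect")]
public class ConnectTask : DefaultValueAwareTask
{
    // If the build file omits the "port" attribute, 16333 is applied automatically
    [TaskAttribute("port"), Int32Validator(1000, 65520), DefaultValue(16333)]
    public int Port { get; set; }

    protected override void ExecuteTask()
    {
        // let the base class apply default values before the parameters are used
        base.ExecuteTask();

        this.Log(Level.Info, "Connecting on port {0}...", this.Port);
    }
}
```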

Generate a Solution File for a Number of C# Projects Files in a Folder

Some time ago I wrote my first T4 template, which generates a solution (*.sln) file out of a number of C# project (*.csproj) files located in a folder and all its descendants. Although it turned out not to be necessary for the task I was working on, and although it's quite simple, I still decided to share it for further reference. Maybe someone will find it useful. So, below is the entire T4 template, with no extra comments:
<#@ template language="C#" hostspecific="false" #>
<#@ output extension=".sln" #>
<#@ parameter name="Folder" type="System.String" #>
<#@ assembly name="System.Core" #>
<#@ assembly name="System.Xml" #>
<#@ assembly name="System.Xml.Linq" #>
<#@ import namespace="System.IO" #>
<#@ import namespace="System.Linq" #>
<#@ import namespace="System.Xml.Linq" #>
Microsoft Visual Studio Solution File, Format Version 11.00
# Visual Studio 2010
<#
if (Directory.Exists(Folder))
{
    var csprojFiles = Directory.GetFiles(Folder, "*.csproj", SearchOption.AllDirectories);
    foreach (var file in csprojFiles)
    {
        ProjectFileMetaData metaData = new ProjectFileMetaData(file, Folder);
        WriteLine("Project(\"{3}\") = \"{0}\", \"{1}\", \"{2}\"", metaData.Name, metaData.Path, metaData.Id, ProjectFileMetaData.ProjectTypeGuid);
        WriteLine("EndProject");
    }
}
#>
<#+
public class ProjectFileMetaData
{
    public static string ProjectTypeGuid = "{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}";

    public ProjectFileMetaData(string file, string root)
    {
        InitProperties(file, root);
    }

    public string Name { get; set; }

    public string Path { get; set; }

    public string Id { get; set; }

    private void InitProperties(string file, string root)
    {
        XDocument xDoc = XDocument.Load(file);
        XNamespace ns = @"http://schemas.microsoft.com/developer/msbuild/2003";
        XElement xElement = xDoc.Root.Elements(XName.Get("PropertyGroup", ns.NamespaceName)).First().Element(XName.Get("ProjectGuid", ns.NamespaceName));
        if (xElement != null)
        {
            this.Id = xElement.Value;
        }

        this.Path = file.Substring(root.Length).TrimStart(new char[] { '\\' });

        this.Name = System.IO.Path.GetFileNameWithoutExtension(file);
    }
}
#>