Thursday, November 10, 2016

It’s time the DevOps “fad” is over now!

This is a rant that has been building up for a while, so if you are strongly opinionated maybe you should move on to the next blog post and skip this one…

It is my opinion that it is time for the “next” buzzword to start making its appearance in IT. Maybe we should call it Tom, or maybe COD (Container Orientated Development); ooh, what about DevCon (Development Containers)? You know, ‘cause containers are the thing right now.

Anyway, let’s not get side-tracked. DevOps is one of those things that most people do not understand, but everyone wants a finger in the pie. Unfortunately, vendors probably carry a lot of the blame here, supplying DevOps services and practitioners and throwing DevOps into every second sentence. I kid you not, I was in a meeting last week where a “DevOps” vendor was literally interchanging CI (Continuous Integration) with DevOps: “I have the build set up for CI, you know, DevOps”. I wonder what Donovan Brown would say about that… It is like people who religiously follow the Banting diet and yet still consume loads of beer. They do not understand the principles, what they are supposed to be doing, or why they are supposed to be doing it!

Execs hear the world talking about DevOps and want themselves a piece of that pie. So the first person or vendor that comes along and throws the word in at strategic places gets the job.

I had a discussion today with a client that is “big” into the DevOps thing. They want to remove developers’ rights to branch their own Git repositories. In my mind that goes against the two largely unspoken aspects of DevOps, namely People and Process. Sure, the Tool can do branching and restrict rights for groups of people, but won’t losing the ability to branch when needed impede agility and leave people unproductive while they cut through red tape to justify the creation of new branches?

Getting back to my initial point: there are a few strong voices that barely reach above all the noise around DevOps, and they talk a good talk. If you look back over IT’s (short) history, you notice that once the big self-serving hordes have moved on to the “next” thing, the die-hards can get down to bedding in the concepts and approaches. Look at Scrum, ALM and a lot of developer practices. I realise this means that, as a vendor myself, I will find it harder to stay afloat, but for my own sanity maybe it is worth it!

Wednesday, September 14, 2016

VSTS Extension Work Item Limits

An error was recently pointed out in one of the extensions that I’ve been working on as part of the ALM Rangers.

Looking at the browser logs we noticed something like this:
"An undefined error occurred while attempting to connect to the server. Status code 0: error."


Very descriptive, right?!
It turns out that the root of the error lies in the fact that we were hitting limits on work item queries. It looks like the VSTS/TFS API only allows us to "read" in the region of about 300 work items in one call, and there are teams out there that need a few thousand at a go.

How did I fix this? A great little JavaScript library called Q.
It was a fairly simple change as we all use the promise pattern for asynchronous calls to the services. Q is a chaining library that allows me to create a bunch of promises and wait for them to complete execution.

Practically this means breaking the list of work item IDs that I need into smaller chunks and then fetching those chunks. We could have simply written a recursive loop to do the same, but I’m lazy.
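For the curious, that recursive alternative might have looked roughly like this (purely a sketch: the method name and shape are mine, not what ended up in the extension):

private GetWorkItemsInChunks(ids: number[], asOfDate: Date, results: any[]): IPromise<any> {
   var defer = $.Deferred<any>();
   // take the next chunk off the front of the list (mutates ids, like the splice calls below)
   var chunk = ids.splice(0, 100);
   if (chunk.length === 0) {
      // nothing left to fetch - resolve with everything gathered so far
      defer.resolve(results);
      return defer;
   }
   // fetch this chunk, then recurse on the remainder, passing the accumulated results along
   this.GetWorkItemDetails(chunk, asOfDate).then(workItems => {
      this.GetWorkItemsInChunks(ids, asOfDate, results.concat(workItems))
         .then(all => defer.resolve(all));
   });
   return defer;
}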

The code change looked like this:

From simply calling the API with all the work item IDs:

client.getWorkItems(backlogIds, null, asOfDate, WorkItemContracts.WorkItemExpand.Relations).then(backlogWorkItems => {
  //process
  defer.resolve(result);
});

To:

var loadSpecs = new Array<IPromise<any>>();
var spliceSize = 100;
var backlogSection = backlog.splice(0, spliceSize);

while (backlogSection.length > 0) {
   // kick off the fetch for this chunk and keep the promise
   loadSpecs.push(this.GetWorkItemDetails(backlogSection, asOfDate));
   // take the next chunk off the backlog
   backlogSection = backlog.splice(0, spliceSize);
}

Q.all(loadSpecs).done(all => {
   // combine "all" the results into one
   defer.resolve(result);
});

The GetWorkItemDetails method simply returns the promise from client.getWorkItems:

public GetWorkItemDetails(backlogItems: number[], asOf: Date): IPromise<any> {
   var client = WorkItemRestClient.getClient();
   var defer = $.Deferred<any>();
   client.getWorkItems(backlogItems, null, asOf, WorkItemContracts.WorkItemExpand.Relations)
      .then(backlogWorkItems => {
         //process
         defer.resolve(result);
      });
   return defer;
}

This may not be the neatest solution, and please do not criticise my JavaScript skills (I’m not a JS developer), but it works and was a lot quicker to “fix” than expected.

For the full source code feel free to go and have a look at the GitHub repo; in fact, why not join in and make it better!

Wednesday, April 13, 2016

Upload Custom Build Tasks to On-prem TFS 2015 Update 2

One of the great features that TFS 2015 Update 2 brings to the party is the ability to add extensions and custom build tasks.

I was sorely missing the SQL dacpac deployment task that has been available on VSTS for a while, so I decided to upload it myself.

First of all, all the source for the VSTS / TFS build tasks is actually available. If you have not already, head over to Microsoft's Git repository and take a look for yourself. You may notice that the SqlServerDacpacDeployment task is just sitting there, ripe for the picking…

Getting the build tasks uploaded to your on-prem server is not as straightforward as it would seem, though. First of all, you need the TFS Cross Platform Command Line (tfx) utility to upload the build tasks, which in turn requires Node.js. Once all that is installed you can start uploading your extensions and build tasks… well, almost.
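If you do not have tfx yet, it is installed globally through npm once Node.js is in place (the standard tfx-cli install):

npm install -g tfx-cli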

Tfx does not yet support integrated authentication, and on-prem versions of TFS do not yet have Personal Access Tokens (PATs). Tfx does, however, support basic authentication, which means we need to tweak our TFS instance a bit to be able to upload our own tasks.


We need to get onto the TFS server and open up IIS. Select the "tfs" application under the Team Foundation Server site and enable basic authentication.

Once you have done that you are ready to upload your tasks. After downloading the task repo from Microsoft, I simply opened up a command prompt and executed the following command:
tfx build tasks upload --service-url http://<<server>>:8080/tfs --auth-type basic --username <<username>> --password <<password>> --task-path .\SqlServerDacpacDeployment

Interestingly enough, that did not work; for the life of me I could not see the task in the list. I eventually figured out that the task.json manifest has a "visibility" section. The first item was "preview", and this seemed to stop the task from being "shown" somehow. After removing it, everything worked like a charm:

"visibility": [
   "Build",
   "Release"
],
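For reference, the section that kept the task hidden looked something like this before the change (reconstructed, so treat the exact casing and order of the entries as illustrative):

"visibility": [
   "Preview",
   "Build",
   "Release"
],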

Now I can play around with deploying dacpacs!

Wednesday, March 9, 2016

DevConf 2016 A Raging Success

Unless you have been living under a rock in South Africa, you will be aware that DevConf was held yesterday.

I must admit it was an awesome day with everyone that gathered and all the energy that they brought!

I was lucky enough to be the first speaker in the DevOps track with my session “DevOps Demystified”. I was impressed to have a nearly full room (about 100 people) attending, and I hope that everyone gained some value.

My slides can be found here.

Kudos to the organisers, and I'm looking forward to the next one!

Wednesday, February 17, 2016

Application Insights and TypeScript

I’m actively involved in creating extensions for VSTS, and one of the questions that comes up a lot is about telemetry. Are people using the extensions, how are they using them, and what about errors and exceptions? It has become such a topic of discussion that Will Smythe has actually gone ahead and given some guidance on how to add Application Insights (AppInsights) telemetry to your extension.

He gives a good overview and example of using AppInsights in a simple JavaScript (and in fact an html page) type application. Personally I prefer using TypeScript to do my JS development.

The method that Will explains and TypeScript do not mix as seamlessly as I would like. Luckily there is hope.
Microsoft also provides a TypeScript type definition for their AppInsights API. Currently it is in preview, but I have not had any problems with it.

You can install it via NuGet (from the Package Manager Console) like this:

Install-Package Microsoft.ApplicationInsights.TypeScript -Pre

Once it is installed, it will dump the libraries into the packages folder in the root of the project. There should be two packages: the JavaScript library and the TypeScript type definitions.

Under the JavaScript folder (Microsoft.ApplicationInsights.JavaScript.0.21.5-build00175 in this case) you will find the scripts in the content\scripts folder. There are two versions of the library, the full and the minified one. Simply copy the minified version (ai.0.21.5-build00175.min.js in this instance) to the scripts folder of your extension.

The package should already have added the TypeScript definition file to your scripts folder, but in case it has not, under the TypeScript folder (Microsoft.ApplicationInsights.TypeScript.0.21.5-build00175 in this case) you will also find a content\scripts folder that contains the type definition (ai.0.21.5-build00175.d.ts in this instance). Copy that to your TypeScript definitions folder.

Now you should be ready to use them. Include the JavaScript library in your extension and reference it in your html page, and add a reference to the TypeScript type definition in your TypeScript files.

/// <reference path="ai.0.21.5-build00175.d.ts" />
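
The matching include in the extension’s html page is just a plain script tag, something along these lines (the path depends on where you copied the minified file to):

<script src="scripts/ai.0.21.5-build00175.min.js"></script>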

To use the library, you need to configure it using a configuration snippet. The configuration snippet contains the instrumentation key that has been set up (following Will’s post):

var snippet: any = {
   config: {
      instrumentationKey: "<<your instrumentation key>>"
   }
};

You can pass this into the initialisation object:

var init = new Microsoft.ApplicationInsights.Initialization(snippet);

And then from the initialisation object you create an AppInsights instance:

var applicationInsights = init.loadAppInsights();

On the AppInsights instance you can go ahead and start to capture your telemetry using the following:

trackPageView(name?: string, url?: string, properties?: Object, measurements?: Object, duration?: number): void;
trackEvent(name: string, properties?: Object, measurements?: Object): void;
trackAjax(absoluteUrl: string, pathName: string, totalTime: number, success: boolean, resultCode: number): void;
trackException(exception: Error, handledAt?: string, properties?: Object, measurements?: Object): void;
trackMetric(name: string, average: number, sampleCount?: number, min?: number, max?: number, properties?: Object): void;
trackTrace(message: string, properties?: Object): void;

The full code would look something like this:

var snippet: any = {
   config: {
      instrumentationKey: "<<your instrumentation key>>"
   }
};

var init = new Microsoft.ApplicationInsights.Initialization(snippet);
var applicationInsights = init.loadAppInsights();
applicationInsights.trackPageView("Index");
applicationInsights.trackEvent("PageLoad");
applicationInsights.trackMetric("LoadTime", timeMeasurement); // timeMeasurement: whatever duration you captured
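
Exception tracking follows the same pattern, using the trackException signature listed above. A minimal sketch (the failing call and the property values are purely illustrative):

try {
   riskyOperation(); // any call that might throw
} catch (e) {
   // handledAt and properties are optional - they just make the exception easier to find in AppInsights
   applicationInsights.trackException(e, "riskyOperation", { page: "Index" });
}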

Wednesday, February 10, 2016

Intro to VS Team Services Extensions

Over the last couple of months I have been quite busy with various VSTS extensions as part of the ALM Rangers.
You can see some of the extensions that I have developed here and here, and I am currently involved with at least three others.

As quick guidance we decided to do what we call a brownbag session (an informal, bring-your-own-lunch-and-listen-in type of session) to try and get more of the Rangers involved and up to speed.
The session was published on Channel 9, so if you are interested in getting up to speed on creating your own extensions, feel free to give it a listen.

Feedback is always welcome.