
Monday, December 4, 2017

Tips and Tricks: User has permission to delete work items, but no delete button on the work item

Problem:

A user in TFS/VSTS has all the rights required to delete work items, yet when they open a work item the "Delete" button is missing, and the delete option does not appear in query lists either.

When you look at the inherited permissions you see something like this:

[Screenshot: the inherited permissions, with deletion showing as allowed]

Everything indicates that the permission is allowed, yet the end result is that it is denied.


Solution:

The user is most likely on the "Stakeholder" access level. If you pay close attention to the list of unavailable features for that level, you will notice that deleting work items is one of the things a stakeholder cannot do.

You can either acquire a license for the user and move him/her to the Basic access level or higher, or the user will need to ask someone who does have those rights to perform the deletions.


Monday, November 27, 2017

Tips and Tricks: TFS Excel Plugin not loading

Problem:

You close an Excel spreadsheet, with or without a TFS/VSTS-connected list in it. When you open the spreadsheet again, the Team tab is missing from the ribbon.

You then have to re-enable the add-in through Excel's options to get the Team tab back.

This also disconnects your TFS/VSTS-linked worksheets, forcing you to reconnect or re-open a query before you can carry on working with the work items.


Solution:

  1. With Excel closed, open up the registry editor (regedit)
  2. Navigate to HKEY_CURRENT_USER\Software\Microsoft\Office\Excel\Addins and find the TFCOfficeShim entries:
    [Screenshot: the TFCOfficeShim entries in the registry]
    There may be more than one, and the version number at the end may differ from the one shown
  3. Remove/delete the older versions (I surely do not have to remind you to back up your registry before you make any changes)
  4. In the remaining TFCOfficeShim key, find the "LoadBehavior" value and make sure it is set to 3

Now re-open the spreadsheet and see if it will load automatically.


Monday, November 6, 2017

Using Office UI Fabric to create a VSTS Extension

In the beginning

In the beginning God wanted to make earth. So, God found a cool-looking bootstrapper and loaded up his favorite command line.
God then stepped through and downloaded the hundreds of packages needed to run the bootstrapper, and then the bootstrapper itself:

npm install create-planet -g

God ran create-planet earth and all was good for a few days. God then decided he wanted to do more. Humans were missing.

npm install human

+ human@0.0.1
added 1702 packages in 5904.903s


God then referenced "humans" in earth.js and tried to compile.

Module not found: Error: Can't resolve sin

Ok, no biggy, npm install sin. Compile.

[at-loader] ./node_modules/@types/Inhabitants/index.d.ts:90:13
TS2403: Subsequent variable declarations must have the same type. Variable
'intelligentlife' must be of type 'Homosapiens', but here has type 'Neanderthal'


Ok, remove package Neanderthal. Compile.

earthpack error -ERROR in earth.js from UglifyUniverse Unexpected token: name (continentParts)

Earthpack… where did that come from? Ok…
5 hours later, God found the Earthpack config and was able to fix the configuration, and the compile was good.

So God ran earth and was promptly told:
Universe.js Error: Invariant Violation: Minified World error #379.

Then there was a big bang.

And now…

I took on a project a while back to summarise and write a “quick start” series of posts for the ALM/DevOps Rangers to highlight the usage of Office UI Fabric. This meant that I needed to get my head around changing an existing extension from “simple” TypeScript to React while incorporating Office UI components. As you may have surmised from the above analogy, I have spent way too much time trying to get things running the way I want them to, based on someone else's concepts, intentions and bootstrapping.
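To give a taste of where the series ends up, here is a minimal sketch of a React component that renders an Office UI Fabric button (assuming the react, react-dom and office-ui-fabric-react packages are installed; the component name, greeting and "root" element id are made up for illustration):

import * as React from "react";
import * as ReactDOM from "react-dom";
import { PrimaryButton } from "office-ui-fabric-react";

// A made-up component that renders a single Fabric-styled button.
class GreetingButton extends React.Component<{}, {}> {
   public render(): JSX.Element {
      return <PrimaryButton onClick={() => alert("Hello from Fabric!")}>Say hello</PrimaryButton>;
   }
}

// Mount the component into the extension's host element.
ReactDOM.render(<GreetingButton />, document.getElementById("root") as HTMLElement);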

If you are interested in the outcomes, please follow the series posted on the MSDN blog:
  1. The start

Sunday, November 5, 2017

Containers on Azure as part of a CI/CD pipeline

In my previous posts I wrote about the journey from setting up a container to publishing it, in a continuous fashion, to a registry using VSTS.

You may have noticed that a lot of time has elapsed since my last post, and there are a couple of reasons for that.

First of all, work got in the way; secondly, I noticed it had become fairly popular to blog about the journey from where I left off, so I hung back and followed those posts for a while.

Instead of me going ahead and creating a bunch of posts to show how to publish your container and run it in production, I’m going to hand it off to a bunch of other capable people :-)

To delve deeper, or just for more information, these are all good reads:


And if you are interested in a brief discussion on how to move to a microservices-based architecture, this is a good read: Modernizing a Monolithic Application using Microservices and Azure


Monday, January 30, 2017

Deploy Docker images to a Private Azure Container Registry

This post continues the journey of creating a dotnet application, containerizing it and ultimately deploying the image to production.

The first thing we need to do is get the source into a source repository (I'm of course going to use VSTS), then configure a build and push the images to a registry. We will then be able to deploy the images from the registry to our hosts, but more on that later.
Note: Some of these steps may incur some cost, so I would highly recommend at the very least creating a Dev Essentials account. This should cover any costs while we are playing.

I'm assuming you have already pushed your code to a repository in VSTS, so the next step is to create an Azure account, if you do not have one already, and then to set up a container registry.

To create your own private Azure container registry to publish the images to:
  1. Log in to Azure
  2. Select "Container registries"
    [Screenshot: the Container registries option in the Azure portal]
  3. This gives you a list, where you will need to click "Add" to create a new container registry
  4. Fill in the required details and create the new registry
  5. Once created, open up the blade and select the Access Key settings. This contains the registry name, login server, and user name and password details (make sure the "Admin User" option is enabled)
    [Screenshot: the Access Key settings on the registry blade]

Now let's move on to VSTS.
First we need to "connect" VSTS and your container registry:
  1. Log in to your VSTS project and, under settings, select the services configuration:
    [Screenshot: the services configuration in VSTS settings]
  2. Using the details from the Access Key settings on the Azure container registry blade, create a Docker registry service with your "Login Server" as the Docker registry URL, along with the user name and password:
    [Screenshot: the Docker registry service connection dialog]

Finally it is time to create the build. As you would expect, add a new "empty" build definition that links to your source repository. Instead of selecting the "Hosted" build queue, use the "Hosted Linux Preview" queue; Docker is not available on the normal hosted Windows agents yet.
Add two command line tasks and three Docker tasks:
[Screenshot: the build definition with two command line tasks and three Docker tasks]

Note: If you do not have the Docker tasks, you will need to install them from the marketplace.
Now configure the tasks as follows:

Command Line 1
  Tool: dotnet
  Arguments: restore
  Advanced/Working Folder: the folder that your source is located in. In my case it was $(build.sourcesdirectory)/dotnet_sample/

Command Line 2
  Tool: dotnet
  Arguments: publish -c release -o $(build.sourcesdirectory)/dotnet_sample/output/ (or an "output" folder under your source location)
  Advanced/Working Folder: see above

Docker 1
  Docker Registry Connection: the service connection that you created earlier
  Action: Build an image
  Docker File: the location of your docker file. In my case it was $(build.sourcesdirectory)/dotnet_sample/dockerfile
  Build Context: the location of your source code. In my case $(build.sourcesdirectory)/dotnet_sample
  Image Name: the name and tag that you want to give your image. In my case I just used dotnet_sample:$(Build.BuildId)
  Advanced/Working Folder: same as the other working folders

Docker 2
  Docker Registry Connection: the service connection that you created earlier
  Action: Run a Docker command
  Command: tag dotnet_sample:$(Build.BuildId) $(DockerRegistryUrl)/sample/dotnet_sample:$(Build.BuildId) (the name must be the same as in the task above, and $(DockerRegistryUrl) must be your Azure container registry URL or login server)
  Advanced/Working Folder: same as the other working folders

Docker 3
  Docker Registry Connection: the service connection that you created earlier
  Action: Push an image
  Image Name: the name you passed in when tagging your container above. In my case it was $(DockerRegistryUrl)/sample/dotnet_sample:$(Build.BuildId)
  Advanced/Working Folder: same as the other working folders

Now you can save and queue the build. Hopefully it will look something like this:
[Screenshot: a successful build run]

If all has passed, a quick and easy way to check that your image is in your registry is to navigate to your Docker registry's catalog URL: "https://<<registry_url>>/v2/_catalog". This will likely prompt you to log in with the username and password that you set up previously, and then you will download a JSON file. Opening this file will show you all the repositories hosted in your registry.
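If you prefer to script that check, here is a minimal sketch in TypeScript (assuming Node 18+ for the built-in fetch; the registry URL and credentials below are placeholders for your own values):

// Placeholders: substitute your own login server and Access Key credentials.
const registryUrl = "myregistry.azurecr.io";
const username = "myregistry";
const password = "<<password>>";

async function listRepositories(): Promise<void> {
   const response = await fetch(`https://${registryUrl}/v2/_catalog`, {
      headers: {
         // The registry uses basic authentication with the same credentials
         // you entered for the VSTS service connection.
         Authorization: "Basic " + Buffer.from(`${username}:${password}`).toString("base64")
      }
   });
   // The v2 catalog endpoint returns JSON shaped like:
   // { "repositories": ["sample/dotnet_sample", ...] }
   const catalog = await response.json() as { repositories: string[] };
   catalog.repositories.forEach(repo => console.log(repo));
}

listRepositories();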

In this post we have moved from a locally created image to one residing in our private registry. In the next post we will continue the journey a bit further…

Wednesday, September 14, 2016

VSTS Extension Work Item Limits

An error was recently pointed out on one of the extensions that I’ve been working on as part of the ALM Rangers.

Looking at the browser logs we noticed something like this:
"An undefined error occurred while attempting to connect to the server. Status code 0: error."

[Screenshot: the error shown in the browser console]

Very descriptive, right?!
It turns out that the root of the error lies in the limits on work item queries: the VSTS/TFS API only allows us to "read" in the region of 300 work items at a time, and it appears that there are teams out there that need a few thousand at a go.

How did I fix this? A great little JavaScript library called Q.
It was a fairly simple change, as we all use the promise pattern for asynchronous calls to the services. Q is a chaining library that allows me to create a bunch of promises and wait for them all to complete.

Practically this means breaking the list of work item ids that I need into smaller chunks and then fetching those chunks. We could have written a recursive loop that performs the same, but I’m lazy.

The code change looked like this:

From simply calling the API with all the work item ids:

client.getWorkItems(backlogIds, null, asOfDate, WorkItemContracts.WorkItemExpand.Relations).then(backlogWorkItems => {
  // process backlogWorkItems into result, then resolve the outer deferred
  defer.resolve(result);
});

To:

var loadSpecs = new Array<IPromise<any>>();
var spliceSize = 100; // fetch the work items in chunks of 100 ids
var backlogSection = backlog.splice(0, spliceSize);

while (backlogSection.length > 0) {
   loadSpecs.push(this.GetWorkItemDetails(backlogSection, asOfDate));
   backlogSection = backlog.splice(0, spliceSize); // take the next chunk
}

Q.all(loadSpecs).done(all => {
   // combine "all" the chunked results into one, then resolve the outer deferred
   defer.resolve(result);
});

The GetWorkItemDetails method simply returns the promise from client.getWorkItems:

public GetWorkItemDetails(backlogItems: number[], asOf: Date): IPromise<any> {
   var client = WorkItemRestClient.getClient();
   var defer = $.Deferred<any>();
   client.getWorkItems(backlogItems, null, asOf, WorkItemContracts.WorkItemExpand.Relations)
      .then(backlogWorkItems => {
         // process backlogWorkItems into result
         defer.resolve(result);
      });
   return defer.promise();
}

This may not be the neatest solution, and please do not criticize my JavaScript skills (I’m not a JS developer), but it works and was a lot quicker to “fix” than expected.

For the full source code, feel free to go and have a look at the GitHub repo; in fact, why not join in and make it better!

Wednesday, April 13, 2016

Upload Custom Build Tasks On-prem TFS 2015 Update 2

One of the great features that TFS 2015 Update 2 brings to the party is the ability to add extensions and custom build tasks.

I was sorely missing the SQL dacpac deployment task that has been available on VSTS for a while, so I decided to upload it myself.

First of all, the source for the VSTS/TFS build tasks is actually available. If you have not already, head over to Microsoft's Git repository and take a look for yourself. You may notice that the SqlServerDacpacDeployment task is just sitting there, ripe for the picking…

Getting the build tasks uploaded to your on-prem server is not as straightforward as it would seem, though. First of all, you need the TFS Cross Platform Command Line utility (tfx) to upload the build tasks, and it in turn requires Node.js. Once all that is installed you can start uploading your extensions and build tasks… well, almost.

Tfx does not yet support integrated authentication, and on-prem versions of TFS do not yet have "Personal Access Tokens", or PATs. Tfx does, however, support basic authentication, which means we need to tweak our TFS instance a bit to be able to upload our own tasks.

[Screenshot: enabling basic authentication for the tfs application in IIS]

We need to get onto the TFS server and open up IIS. Select the "tfs" application under the Team Foundation Server site and enable basic authentication.

Once you have done that you are ready to upload your tasks. After downloading the task repo from Microsoft, I simply opened a command prompt and executed the following command:
tfx build tasks upload --service-url http://<<server>>:8080/tfs --auth-type basic --username <<username>> --password <<password>> --task-path .\SqlServerDacpacDeployment

Interestingly enough, that did not work; for the life of me I could not see the task in the list. I eventually figured out that the task.json manifest has a "visibility" section. The first item was "preview", and this seemed to stop the task from being "shown" somehow. After removing it, everything worked like a charm.

"visibility": [
   "Build",
   "Release"
],

Now I can play around with deploying dacpacs!

Wednesday, February 17, 2016

Application Insights and TypeScript

I’m actively involved in creating extensions for VSTS, and one of the questions that comes up a lot concerns telemetry. Are people using the extensions, how are they using them, and what about errors and exceptions? It has become such a topic of discussion that Will Smythe has gone ahead and given some guidance on how to add Application Insights (AppInsights) telemetry to your extension.

He gives a good overview and an example of using AppInsights in a simple JavaScript (and in fact plain HTML page) type of application. Personally, I prefer using TypeScript for my JS development.

The method that Will explains and TypeScript do not mix as seamlessly as I would like. Luckily there is hope.
Microsoft also provides a TypeScript type definition for their AppInsights API. Currently it is in preview, but I have not had any problems with it.

You can install it via NuGet like this:

Install-Package Microsoft.ApplicationInsights.TypeScript -Pre

Once it is installed, it will dump the libraries into the packages folder in the root of the project. There should be two packages: the JavaScript library and the TypeScript type definitions.
[Screenshot: the two packages in the packages folder]

Under the JavaScript folder (Microsoft.ApplicationInsights.JavaScript.0.21.5-build00175 in this case) you will find the scripts in the content\scripts folder. It contains two versions of the library, full and minified. In our instance, simply copy the minified version (ai.0.21.5-build00175.min.js in this instance) to the scripts folder of your extension.

The package should already have added the TypeScript definition file to your scripts folder, but in case it has not: under the TypeScript folder (Microsoft.ApplicationInsights.TypeScript.0.21.5-build00175 in this case) you will also find a content\scripts folder that contains the type definition (ai.0.21.5-build00175.d.ts in this instance). Copy it to your TypeScript definitions folder.

Now you should be ready to use them. Include the JavaScript library in your extension and reference it in your HTML page, and add a reference to the TypeScript type definition in your TypeScript files.

/// <reference path="ai.0.21.5-build00175.d.ts" />

To use the library, you need to configure it using a configuration snippet. The configuration snippet contains the instrumentation key that has been set up (following Will’s post):

var snippet: any = {
   config: {
      instrumentationKey: "<<your instrumentation key>>"
   }
};

You can pass this into the initialisation object:

var init = new Microsoft.ApplicationInsights.Initialization(snippet);

And then from the initialisation object you create an AppInsights instance:

var applicationInsights = init.loadAppInsights();

On the AppInsights instance you can go ahead and start to capture your telemetry using the following:

trackPageView(name?: string, url?: string, properties?: Object, measurements?: Object, duration?: number): void;
trackEvent(name: string, properties?: Object, measurements?: Object): void;
trackAjax(absoluteUrl: string, pathName: string, totalTime: number, success: boolean, resultCode: number): void;
trackException(exception: Error, handledAt?: string, properties?: Object, measurements?: Object): void;
trackMetric(name: string, average: number, sampleCount?: number, min?: number, max?: number, properties?: Object): void;
trackTrace(message: string, properties?: Object): void;

The full code would look something like this:

var snippet: any = {
   config: {
      instrumentationKey: "<<your instrumentation key>>"
   }
};

var init = new Microsoft.ApplicationInsights.Initialization(snippet);
var applicationInsights = init.loadAppInsights();
applicationInsights.trackPageView("Index");
applicationInsights.trackEvent("PageLoad");
applicationInsights.trackMetric("LoadTime", timeMeasurement); // timeMeasurement: a number captured elsewhere
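
Exception telemetry works the same way. As a minimal sketch (doWork is a hypothetical stand-in for your own code):

try {
   doWork();
} catch (error) {
   // Report the handled exception to AppInsights, with a hint of where it was handled.
   applicationInsights.trackException(error, "doWork", { page: "Index" });
   throw error;
}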

Wednesday, February 10, 2016

Intro to VS Team Services Extensions

Over the last couple of months I have been quite busy with various VSTS extensions as part of the ALM Rangers.
You can see some of the extensions that I have developed here and here, and I am currently involved with at least three others.

As quick guidance, we decided to do what we call a brown-bag session (an informal, bring-your-bagged-lunch-and-listen-in type of session) to try to get more of the Rangers involved and up to speed.
The session was published on Channel 9, so if you are interested in getting up to speed on creating your own extensions, feel free to give it a listen.

Feedback is always welcome.