Thursday, October 11, 2018

TFS Reporting using Power BI and REST services





I've been getting the question about reporting options in TFS, specifically on-premises, fairly often. Currently we have a few options: use queries with charts or widgets, set up the warehouse / Analysis Services infrastructure, or wait for the Analytics service to come on-premises. A few clients need more detail than queries can offer, do not want the overhead of configuring and running the warehouse, but also do not want to wait indefinitely for Analytics to make its on-premises appearance.
I have been doing some high level stuff with Power BI Desktop and thought I would give it a go. The problem I faced is that there is not a lot of material around how to do it. So let's see if we can help out a bit.

Firstly you would need to get Power BI Desktop.
After you have created a new empty report you need to decide what information you are looking for. As you know, you can get a lot of detail from the TFS/Azure DevOps REST services. This post should give you the basics to get started and report on information derived from pretty much any one of those sources.
After you have decided what information you are looking for, we can start. The example I'm going to use is going to be historic information around work history. That is a particularly cruel thing to get out of TFS and report on.
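For this example I will assume the work item revisions reporting endpoint; the exact URL shape and api-version are assumptions based on TFS 2018, so adjust them to your release. A trimmed response looks roughly like this:

  GET http://tfs:8080/tfs/DefaultCollection/MyProject/_apis/wit/reporting/workitemrevisions?startDateTime=2018-01-01&api-version=4.1

  {
    "values": [
      {
        "id": 1,
        "rev": 2,
        "fields": {
          "System.Id": 1,
          "System.WorkItemType": "Bug",
          "System.State": "Active",
          "System.ChangedDate": "2018-01-15T09:12:00Z"
        }
      }
    ],
    "isLastBatch": true
  }

Note the "values" array and the nested "fields" record; the query we build next peels those apart.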
To start, we need to go and add the queries we need. To do this we will "Edit Queries".
In the Query Editor, we create a new blank source.
Give it a name and then select the advanced editor.
In the advanced editor we will create a "function" that will take in the TFS server URL, the Team Project, and in this case a start date as parameters.
The start date is the point from which we start retrieving the history.
The "source" will look something like this:
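(The original screenshot is not available here, so below is a reconstruction of the M function based on the numbered notes that follow. The REST endpoint, api-version, and the field list are assumptions; the lines are arranged so that lines 4 to 10 line up with the notes.)

  (tfsUrl as text, teamProject as text, startDate as text) =>
  let
      // assumed endpoint and api-version – adjust to your TFS release
      Source = Json.Document(Web.Contents(tfsUrl & "/" & teamProject & "/_apis/wit/reporting/workitemrevisions?startDateTime=" & startDate & "&api-version=4.1")),
      Values = Source[values],
      ToTable = Table.FromList(Values, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
      WithFields = Table.ExpandRecordColumn(ToTable, "Column1", {"fields"}, {"fields"}),
      Expanded = Table.ExpandRecordColumn(WithFields, "fields", {"System.Id", "System.WorkItemType", "System.State", "System.ChangedDate"}, {"Id", "WorkItemType", "State", "ChangedDate"}),
      Typed = Table.TransformColumnTypes(Expanded, {{"Id", Int64.Type}}),
      TypedDates = Table.TransformColumnTypes(Typed, {{"ChangedDate", type datetime}})
  in
      TypedDates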
Line 4: This is where the REST service is being invoked and the resulting JSON is returned
Line 5: Next we need to extract the "values" element, which is an array, from the resulting JSON
Line 6: Convert the JSON array to a table with records; each row still contains a JSON-like record object
Line 7: That record, in this case, still wraps the detail we are after, so we expand the "fields" property into a "fields" column
Line 8: If you look at the result now, you would see a "fields" column with rows of "records", which simply means we need to expand even further to extract the columns we are looking for. You can see I have specified the columns and their column headings in the Table.ExpandRecordColumn function
Lines 9 and 10: Just some data type conversions on the columns, so that the report behaves appropriately (allowing sums or averages on number fields, for example)
Because this is a "function" we have just created, we can invoke it and pass in the correct parameters. If you click "Done" on the advanced editor, you should see your "query" in the left-hand pane. Selecting it will prompt you for the parameters, and after clicking "Invoke" you should see some results.
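You can also invoke it from another blank query instead of the UI prompt. Assuming you named the function GetWorkItemHistory (the server URL and project name here are placeholders), an invocation is just:

  let
      History = GetWorkItemHistory("http://tfs:8080/tfs/DefaultCollection", "MyProject", "2018-01-01")
  in
      History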
You can then rename that new table, click "Apply and Close", and use it to start creating the best looking reports and charts ever…
Just as an example, the attached Power BI report has the work item history that we have just covered, as well as commit history on Git repos. You may notice that the initial JSON format is slightly different, but the concepts are still the same.

Get the sample here


http://www.teamfoundation.co.za



Wednesday, June 27, 2018

Team Foundation Server Anti-Virus Exclusions

A while back I had big problems with a client's on-premises TFS environment suddenly running the CPU to near 100% for 2 – 5 minutes at a time. This slowed the entire system down to the extent that developers could not accomplish much on TFS. It happened intermittently, and it took us a few weeks to pin it down to the anti-virus. TFS uses a local cache, and when you check out large repos (especially Git repositories) it does a lot of churning and caching; the anti-virus insists on scanning each action, which slows down the entire process.

I searched online and found some guidance around anti-virus exclusions, but I decided to add to it with a few of my opinions included of course.


Team Foundation Server

There are primarily 3 things that need to be excluded on the TFS server itself.

  1. W3wp.exe process. This is the Internet Information Services (IIS) worker process and is typically located here: C:\Windows\System32\inetsrv\w3wp.exe
  2. TFSJobAgent.exe process. This is the TFS background worker process and is typically located under the TFS install path in the "\Application Tier\TFSJobAgent" folder
  3. TFS file cache location. This is where we had issues. TFS does a lot of caching on the server and uses this location to write it to disk. It is one of the configurations asked for during the install, and you can see what it is by opening the TFS administration console and checking the "File Cache Location" property. The default location is "C:\TfsData\ApplicationTier\_fileCache". You want to exclude on-access scanning for this location (a PowerShell sketch follows this list).
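If your anti-virus is Windows Defender, a minimal PowerShell sketch of these three exclusions could look like the following (the TFS install path is an assumption based on a default TFS 2018 setup; substitute your own paths, or translate the idea to your anti-virus product's console):

  # process exclusions: the IIS worker process and the TFS job agent
  Add-MpPreference -ExclusionProcess "C:\Windows\System32\inetsrv\w3wp.exe"
  Add-MpPreference -ExclusionProcess "C:\Program Files\Microsoft Team Foundation Server 2018\Application Tier\TFSJobAgent\TFSJobAgent.exe"

  # path exclusion: the TFS file cache location (stops on-access scanning there)
  Add-MpPreference -ExclusionPath "C:\TfsData\ApplicationTier\_fileCache"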

TFS Build Machine

The TFS build machine may also take a bit of strain when it comes to the agent's working directory. Excluding it may be a contentious choice, because it is possible for malicious code to be executed under that location, but in a secure environment it may speed your build process up slightly.
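Again assuming Windows Defender, a one-liner along these lines would cover it (the agent root here is hypothetical; check your agent's configured work folder):

  # path exclusion for the build agent's working directory
  Add-MpPreference -ExclusionPath "C:\agent\_work"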

SQL

Once the preceding exclusions are in place and you want to move on to SQL, there is a pretty decent overview of what needs to be excluded here.


After we added these exclusions the TFS servers were humming along without any issues.


Monday, May 7, 2018

Linux and Windows even more love with containers (LCOW)

I noticed "Linux containers on Windows built with LinuxKit" (LCOW) a while back. Today I found a blog post from Brian actually running a Windows and a Linux container together.
There are a few steps that you need to follow to get things working though.
  1. You need to be working on a later release of Docker (at least 18.03, I believe)
  2. Switch to Windows containers
  3. Make sure the experimental features are enabled in the Docker settings
  4. You then need to install LinuxKit on your machine, which is luckily fairly simple:
    1. Download the latest release
    2. Run an elevated PowerShell prompt and execute the following:
      Remove-Item "$env:ProgramFiles\Linux Containers" -Force -Recurse
      Expand-Archive release.zip -DestinationPath "$Env:ProgramFiles\Linux Containers\."
  5. Now, the first thing I tried was to run the MS-SQL container on Linux (as did Brian) and we both ran into the same memory issue. It is a bit hairy to fix, and you may not even need to do this depending on what you are trying to run, but just in case:
    1. We need to recompile the Docker daemon. Luckily there are steps to do exactly that, in a Docker container. Brian explains it quite well, but missed a few steps that I will fill in:
      1. You will find the client_local_windows.go file in the libcontainerd directory under the root of the docker source code
      2. You will need to find the configuration section, round about line 360, and insert MemoryMaximumInMB: 4096, into that config:

        configuration := &hcsshim.ContainerConfig{
            HvPartition:                 true,
            Name:                        id,
            SystemType:                  "container",
            ContainerType:               "linux",
            Owner:                       defaultOwner,
            TerminateOnLastHandleClosed: true,
            MemoryMaximumInMB:           4096, // <- the line to insert
        }
      3. When running step 4 (from the steps) to build the images, add the "-m 2GB" argument, otherwise you will hit an "out of memory" error
      4. Finally, copy only the "dockerd.exe" file over to the original Docker install folder, under the resources directory. On my machine: C:\Program Files\Docker\Docker\resources
This allowed me to run both the Linux MS-SQL image and a Windows-based dotnet sample image together.
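For reference, starting the two side by side looked something along these lines (the image names, tags, and SA password are illustrative, and --platform needs the experimental daemon enabled above):

  # the Linux MS-SQL container, running on LinuxKit
  docker run --platform linux -d -e ACCEPT_EULA=Y -e SA_PASSWORD=Str0ngP@ssw0rd microsoft/mssql-server-linux

  # a Windows-based dotnet sample container alongside it
  docker run --platform windows -d microsoft/dotnet-samples:dotnetapp-nanoserver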
Magic!