
Sunday, June 1, 2014

Release Management Event in Cape Town

Getting to know Release Management…

In partnership with Microsoft SA, Team Foundation Consulting will be bringing you an afternoon focussed on Release Management.

Join us and see how to manage your deployment to create better value.

This event is free and seats are limited.

Please reserve your seat now.

Tuesday, April 1, 2014

Visual Studio Online Pricing

Seems Microsoft has finally released the pricing model for Visual Studio Online!

| Edition      | Note         | Intro Price                | Actual Price | More info |
|--------------|--------------|----------------------------|--------------|-----------|
| Basic        | 5 users free | $10.00 per additional user | $20.00       | http://www.visualstudio.com/en-za/products/visual-studio-online-basic-vs |
| Professional | Incl VS Pro  | $22.50 pp                  | $45.00       | http://www.visualstudio.com/en-za/products/visual-studio-online-professional-vs |
| Advanced     |              | $30.00 pp                  | $60.00       | http://www.visualstudio.com/en-za/products/visual-studio-online-advanced-vs |

The interesting thing here is that the Professional subscription actually includes a "rented" version of VS Professional for the duration of the subscription. The downside is that you can only have up to 10 pro users on your account.

For an overview go and look here: http://www.visualstudio.com/products/visual-studio-online-overview-vs

Wednesday, March 26, 2014

TFS 2013 Update 2 Gems

Just been busy installing and playing around with TFS 2013 Update 2 RC, and right off the bat I noticed two little gems…
1) You can now specify the location of the TFS server's cache as part of the install, so you are able to put it on a non-system drive.
2) Looking at the upgrade process there were a couple of hints, and I finally tracked it down in the test area… it would appear that you are - or at least will be - able to share parameters across test cases. Very cool…
You will also notice that web access is looking a lot closer to Visual Studio Online.
And yes, if you look closely enough you may notice some indication that it is in fact an RC and that some polishing for on-premise use may still be needed...

I wonder when Application Insights will become available on-premise :)

Monday, February 10, 2014

TFS Build "Machine"

The TFS Build environment can be a bit complex, with a couple of components that play off each other. Let's see how everything fits together.

TFS Build

Let's start with the basic build… When you install TFS and you "Configure Build" using the TFS Admin console, you are in fact setting up three components: the build service, a controller, and as many agents as you may need.

The build service is the communication mechanism between TFS and the build components. You need to have the service running on every build machine, whether that machine only hosts agents or contains both a build controller and agents. The controller (as its name implies) controls the agents and manages which builds are handed off to which agents. Finally, the agents perform the work.


The controller and/or the agents can run on separate machines, so you could have a setup where TFS is on Serv1, the controller and potentially one agent are on Serv2, and another two agents are on Serv3. One thing to note is that a TFS Project Collection can have multiple controllers associated with it, but a controller can only service one project collection. You can also only have one controller installed on a machine.

You may be tempted to attach multiple controllers to a team project collection, but in most cases you should only require one controller and then a number of agents spread across multiple machines.
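
To make the layout above a little more concrete, here is a small, purely illustrative sketch in Python (not a real TFS API; the Serv1 to Serv3 names come from the example above) that models the topology and checks two of the rules just mentioned:

```python
# Toy model of the build topology described above (illustrative only - this is
# not a real TFS API). It encodes two rules from the text: every build machine
# needs the build service, and a machine can host at most one controller.

from collections import Counter

# Example layout from the post: TFS on Serv1, controller + agent on Serv2,
# two more agents on Serv3.
topology = {
    "Serv1": ["TFS Application Tier"],
    "Serv2": ["Build Service", "Build Controller", "Build Agent"],
    "Serv3": ["Build Service", "Build Agent", "Build Agent"],
}

def validate(topology):
    problems = []
    for machine, roles in topology.items():
        counts = Counter(roles)
        if counts["Build Controller"] > 1:
            problems.append(f"{machine}: only one build controller per machine")
        if (counts["Build Controller"] or counts["Build Agent"]) and not counts["Build Service"]:
            problems.append(f"{machine}: build machines need the build service running")
    return problems

if __name__ == "__main__":
    for line in validate(topology) or ["Topology looks valid"]:
        print(line)
```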

Lab Management

Lab Management brings with it a bunch of additional resources and components. More specifically it has its own Test Controller and Test Agents.
You would use test controllers and agents when you want to "run tests remotely, distribute tests across multiple machines or run load tests".


Unlike the build agents in the previous section, you would only have a single test agent deployed on a machine. These components form part of the BDT (Build, Deploy and Test) workflow.

Release Management

The new kid on the block, Release Management, leverages the default build to obtain compiled artefacts. In fact it has a custom build template that actually instantiates a release pipeline in Release Management.

Recap

So, to recap: TFS has the primary build (Team Build) infrastructure, which consists of a build controller and build agents and is used to compile and run initial tests on the code that is in version control. Then Lab Management, with its test controllers and test agents, takes this a step further, allowing some deployment workflows and once again adding test capabilities.

Friday, November 29, 2013

ALM days : save the date…

ALM days Cape Town is happening on 23 January 2014!

Team Foundation Consulting will be launching ALM days in Cape Town on 23rd January 2014!

Learn about updates to Microsoft’s Application Lifecycle Management offerings which will enable your software development teams to be more productive and to collaborate more effectively. This event provides insight, advice, strategies and techniques to improve quality and ensure the final application meets the needs and expectations of users.

Reserve your seat today!

Find out more at www.alm-days.co.za

The event is free and seats are limited.

Join us for an awesome day of ALM!

Monday, September 23, 2013

What happened to the TFS 2013 Build Templates?

I have had a couple of people asking me "where are the build templates in TFS 2013?".

Well, they have been removed!
Don't worry, they are still available though :)

By default, they are now simply stored in TFS and not in the repository.

When creating a build definition you will notice that the default build templates can simply be selected (as per usual).


When you need to use a custom template, or modify one of the existing templates, you can simply download the template,


make your changes and then upload the updated build template to the repository as per normal.


Microsoft has reworked things in the build templates, simplifying them quite a lot. Hopefully altering your build processes will no longer be such a daunting task!

Friday, August 16, 2013

Why move from VSS to TFS (Very Sore Safe to Truly Fantastic Server)

Let me give you a hint: Not only is it faster, it’s also more reliable! (There, blog post done : )

Let me expand on the above:

It’s fast!

Seriously, a lot faster.

Anybody who's ever had to sit and pay the VSS tax while dreaming of their post-work beer, waiting for a history lookup, a search, or especially "View difference", will know what I mean.

There is a great difference in architecture between the two. I'll discuss a few of the differences to give you an idea of why you should consider moving.

Storage:

TFS uses a SQL Server database to store Team Project Collections; VSS uses the file system. So how is this better?

· Size – (Yes, it does matter) VSS can store up to 4GB reliably; TFS can go into terabytes

· Reliability – Ever had a network error when checking in on VSS? You're left with corrupt files and a caffeine overdose. TFS commits a transaction, which can be rolled back if there is an error

· Indexing on the tables means faster searches – Did I mention TFS is faster?

· And of course, having a DB as your data store, you can have all the usual goodies like mirrored and clustered DBs for TFS, so you never have to lose anything or have any downtime!

Networking:

TFS uses HTTP services vs. file shares (That should be enough said)

· Option of a TFS proxy for remote sites to save bandwidth and speed things up a little

· Did I mention that TFS is faster?

Security:

TFS uses Windows Role-Based Security vs. VSS security (I don’t think the methodology was good enough for someone to even come up with a name for it – I’ll just call it Stup-Id, there we go, you’re welcome ;)

Windows Role-Based Security vs. VSS’s Stup-Id:

· With Win Roles you can specify who’s allowed to view, check-out, check-in and lots more. With Stup-Id you can set rights per project, but all users must have the same rights for the database folder. This means all users can access and completely muck up the shared folders. Not pretty.

Extra functionality and pure awesomeness:

· Shelve sets – this is really handy to store code if you don’t want to commit it just yet. Say you go for lunch and you’re afraid that BigDog might chew up your hard drive again: all you do is shelve your code – this stores it in TFS. Once you’ve replaced said eaten hard drive you just unshelve and... tada! No need to say the dog ate my homework.

· Code review – Developer A can request a code review from another developer who can add comments to the code and send it back. (Basically sharing a shelve set)

· Gated check-ins: You can set rules to only allow check-ins when certain conditions have been met. For example, only check in code when:

o the code builds successfully, or

o all unit tests have passed, or

o the code has been reviewed

· Work Items – Bug/issue tracking made with love. It removes that nagging feeling at the back of your mind that one of these days a PHP or MySQL update will break your free open source ticketing/bug tracking system.

· Change sets – basically all the items that you’ve changed and are checking in. You can also associate change sets with work items for better issue tracking.

· Build automation – automate build and deploys (How cool is this?)

But for me the Pièce de résistance is:

Have you ever had a new developer change files outside of the IDE? Maybe they changed the read-only attribute and made some changes? This completely confuses VSS and is a great way to get your source control out of sync. In TFS you can edit files outside of the IDE to your heart's content and TFS will pick the changes up and queue them for the next commit.

How to move?

Google “Visual Source Safe Upgrade Tool for Team Foundation Server” and follow the instructions.

And that is why TFS will make you happy. Better source control means better code quality, leading to happy customers, and maybe being the next Bill Gates (unless you wanted to be Guy Fawkes).


Tuesday, August 6, 2013

Software Deployment (Part 2)

In the previous post I was discussing how one could go about packaging software to make the long journey from development into production.


In this post I will take a brief look at a couple of tools or applications that I have come across, to take those packages and automate their deployment. Using them will lower the friction and reduce the reliance on human (and possibly problematic) intervention.

 

Continuous Integration

Once again, we all know that continuous integration is a "basic right" when it comes to development environments, but it does not need to be limited to them. If you are using one of the numerous CI environments, extending it to deploy the packages from the previous post should be fairly simple.

I have done this a couple of times, to varying degrees of complexity, in TFS. It is possible to alter the build template to do pretty much anything you require: set up default deployment mechanisms and then, by simply changing a few parameters, point them at different environments.

I have done everything from database deployments and remote MSI installations to SharePoint deployments, using just the TFS build to do all the work.
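
As a rough illustration of the kind of step a customised build template might hand off to (via the "InvokeProcess" activity, for example), here is a minimal sketch that installs an MSI silently. The package and log paths are assumptions, and a real remote installation would wrap something like this in PsExec, WinRM or similar:

```python
# Minimal deployment step a customised TFS build template could invoke.
# Paths are illustrative assumptions; msiexec flags are the standard ones.
import subprocess
import sys

MSI_PATH = r"\\buildserver\drops\MyApp\MyApp.Setup.msi"  # assumed drop location
LOG_PATH = r"C:\DeployLogs\MyApp.install.log"            # assumed log location

def install_msi(msi_path: str, log_path: str) -> int:
    """Run a silent MSI install and return the msiexec exit code."""
    command = [
        "msiexec.exe",
        "/i", msi_path,    # install the package
        "/qn",             # no UI (silent)
        "/l*v", log_path,  # verbose log for troubleshooting
    ]
    return subprocess.call(command)

if __name__ == "__main__":
    # A non-zero exit code makes the build step (and hence the build) fail.
    sys.exit(install_msi(MSI_PATH, LOG_PATH))
```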

 

3rd Party Deployment Agents

InRelease

You must have heard by now: a very exciting acquisition by Microsoft was the InRelease application from InCycle. This basically extends TFS Build and adds a deployment workflow. It takes the build output (which could be anything that was discussed in the previous post) and, once again, kicks off a workflow that includes everything from environment configuration to authorisation of deployment steps.

In SAP they speak of "Transports" between environments, and this, in my mind, speaks to the same idea of transporting a package into different environments.

I’m really excited about this, and I can already see a couple of my clients making extensive use of it.


Octopus Deploy

Another deployment-focussed package that I have been following is Octopus Deploy (OD). OD works on the same premise as InRelease, having agents/deployers/tentacles in the deployment environment that "do the actual work".

A key differentiator is that OD sources updates etc. from NuGet feeds, so you need to package your deliverables and then post them to a NuGet server. As I explained in the previous post, NuGet is a very capable platform, and with a number of free NuGet servers around, you can very easily create your “private” environment for package deployment.
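
As a hedged sketch of that flow, pushing a packaged deliverable to a private NuGet feed with the stock nuget.exe command line might look something like this (the feed URL, API key and package name are purely assumptions):

```python
# Push an already-packaged deliverable to a private NuGet feed so that
# Octopus Deploy (or any other consumer) can pick it up.
# The feed URL, API key and package file name are illustrative assumptions.
import subprocess

PACKAGE = "MyApp.Web.1.0.0.nupkg"
FEED_URL = "http://nuget.example.local/api/v2/package"
API_KEY = "your-api-key-here"

def push_package(package: str, feed_url: str, api_key: str) -> None:
    """Publish the package using the standard 'nuget push' command."""
    subprocess.check_call([
        "nuget.exe", "push", package,
        "-Source", feed_url,
        "-ApiKey", api_key,
    ])

if __name__ == "__main__":
    push_package(PACKAGE, FEED_URL, API_KEY)
```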

System Center

Do not forget System Center, or more specifically System Center Configuration Manager (SCCM). SCCM is a great way to push or deploy applications (generally MSIs) to different servers or environments. It is very capable in its own right and, more importantly (assuming you have packaged the software properly), can be set up, configured and managed by the ops team.

Thursday, July 18, 2013

Software Deployment (Part 1)

So, one of the tenets of ALM is automation, and probably one of the most time-consuming and problem-prone aspects of the lifecycle is deployment.

It just so happens that I have been in a couple of discussions around how the gap between the point where the developer "completes" the work and what is moved into QA, UAT or Prod environments is actually bridged. In fact, I know of a notable company that lost a great deal of money because a critical system was down for a day after one step was not completed by the deployment team.


We all know that continuous integration is a "basic right" when it comes to development environments, and recently there has been an emphasis on continuous deployment, but how do we package and move deliverables from one area to another?

So I want to start with packaging.

There are a couple of ways that we can package our applications, based on the technology that they are built in, the target environments and/or the amount of configuration that is needed when deploying.

Remembering that “If You're Using XCopy, You're Doing It Wrong”, these are a couple of techniques / tools I would keep in mind:

Installers (aka MSI)

Visual Studio used to have a "Microsoft" installer project type, available with VS 2010 and earlier. It was fairly intuitive, and I know a couple of companies that have not considered upgrading due to their investment in this project type.

InstallShield Limited Edition was also packaged with VS 2010 and is still available with VS 2012. Although limited, it does provide basic functionality and some integration with build environments.

There are a couple of other MSI creation tools, but my favourite has to be WiX. It was actually almost made a part of Visual Studio, but it was decided that this would limit the release cadence and inhibit the flexibility of fixing issues and adding new functionality.
It is a highly flexible platform, plugging into builds without much effort.
And if you are wondering whether it is any good: a large number of Microsoft's applications are actually released using WiX-based installers.

MS Deploy / Publish

You have had the ability to "publish" sites in Visual Studio for a while now, but it was basically a "smart" xcopy. In recent years there has been quite a bit of work done on the Web Deploy (MS Deploy) utilities. It is basically an extensible framework with providers that supply certain capabilities.
One of the more significant capabilities is deploying web site "packages" to IIS; then there is the "dbDacFx" provider, which will actually deploy database changes to target servers as part of the deployment.
This is indeed a powerful tool for the deployment arsenal, especially when working in load-balanced and "highly available" environments.

Integrating this into your build is also a cinch: merely add a couple of MSBuild parameters and you are A-for-away. More complex publish scenarios may involve resorting to the "InvokeProcess" activity, but that is not so bad either.

(Indeed, Scott Hanselman's quote "If You're Using XCopy, You're Doing It Wrong" comes from a session he did on Web Deploy.)
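
To give an idea of the "couple of MSBuild parameters" mentioned above, here is a minimal sketch of a build-and-publish invocation; DeployOnBuild and PublishProfile are the standard Web Deploy properties, while the project path and the "Staging" profile name are assumptions:

```python
# Build a web project and publish it with Web Deploy (MS Deploy) by passing
# the standard DeployOnBuild/PublishProfile MSBuild properties.
# The project path and profile name "Staging" are illustrative assumptions.
import subprocess

PROJECT = r"C:\Source\MyApp\MyApp.Web\MyApp.Web.csproj"
PUBLISH_PROFILE = "Staging"  # a publish profile defined in the project

subprocess.check_call([
    "msbuild.exe", PROJECT,
    "/p:Configuration=Release",
    "/p:DeployOnBuild=true",                 # trigger Web Deploy after the build
    f"/p:PublishProfile={PUBLISH_PROFILE}",  # which publish profile to use
])
```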

ClickOnce

If you are working with "client deployable" applications, such as WinForms or WPF applications, ClickOnce should definitely be a consideration. It is once again easy to integrate into a build process, and if you can get away with merely "copying" assemblies across, then you have a winner.

NuGet

NuGet is fast becoming the de facto standard for packaging components in the development environment. Companies have also started adopting it in their release management due to the ease of integration and modification. It is a very simple, yet powerful way of packaging and publishing components. With configuration, and even source code transformations and PowerShell scripts, you can do pretty much anything during install or uninstall actions.
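
As a small sketch of the packaging side (the nuspec file name, version and output folder are assumptions), creating such a package from a .nuspec manifest with the stock nuget.exe command line could look like this:

```python
# Package deliverables into a .nupkg from a .nuspec manifest using nuget.exe.
# The nuspec file name, output folder and version are illustrative assumptions.
import subprocess

NUSPEC = "MyApp.Web.nuspec"
OUTPUT_DIR = r"C:\Drops\Packages"
VERSION = "1.0.0"

subprocess.check_call([
    "nuget.exe", "pack", NUSPEC,
    "-Version", VERSION,             # override the version in the nuspec
    "-OutputDirectory", OUTPUT_DIR,  # where to place the generated .nupkg
])
```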

Quick Comparison

|                     | MSI | MS Deploy | ClickOnce          | NuGet |
|---------------------|-----|-----------|--------------------|-------|
| Database Deployment | Yes | Yes       | No                 | Yes   |
| Install Time UI     | Yes | No        | No (single prompt) | No    |
| Web Sites           | Yes | Yes       | No                 | Yes   |
| Windows Application | Yes | Yes       | Yes                | Yes   |
| Extensible          | Yes | Yes       | No                 | Yes   |
| Build Integration   | Yes | Yes       | Yes                | Yes   |
| Automatic Updates   | Yes | No        | Yes                | No    |
| GAC Installable     | Yes | Yes       | No                 | Yes   |