Tuesday, August 6, 2013

Software Deployment (Part 2)

In the previous post I discussed how one could go about packaging software for the long journey from development into production.


In this post I will take a brief look at a couple of tools and applications that I have come across that take those packages and automate their deployment. Using them lowers the friction and reduces the reliance on human (and possibly error-prone) intervention.

 

Continuous Integration

Once again, we all know that continuous integration is a “basic right” when it comes to development environments, but it does not need to be limited to them. If you are using one of the numerous CI environments, extending it to deploy the packages from the previous post should be fairly simple.

I have done this a couple of times, to varying degrees of complexity, in TFS. It is possible to alter the Build Template to do pretty much anything you require. You can set up default deployment mechanisms and then, by simply changing a few parameters, point the build at different environments.

I have done everything from database deployments and remote MSI installations to SharePoint deployments using just the TFS Build to do all the work.
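To make that concrete, here is a minimal sketch of the kind of custom workflow activity you can drop into an edited Build Template to run an MSI installation. The activity name and its argument are my own hypothetical example, not part of the TFS API:

```csharp
using System.Activities;
using System.Diagnostics;
using Microsoft.TeamFoundation.Build.Client;

// A minimal sketch of a custom TFS build activity; "RunMsiDeployment"
// and "MsiPath" are hypothetical names invented for this example.
[BuildActivity(HostEnvironmentOption.All)]
public sealed class RunMsiDeployment : CodeActivity
{
    // Path to the MSI that the build produced.
    public InArgument<string> MsiPath { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        string msiPath = context.GetValue(MsiPath);

        // Install silently; a real activity would also capture the
        // exit code and write it to the build log.
        using (Process msiexec = Process.Start(
            "msiexec.exe", "/i \"" + msiPath + "\" /qn"))
        {
            msiexec.WaitForExit();
        }
    }
}
```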

 

3rd Party Deployment Agents

InRelease

You must have heard by now: a very exciting acquisition by Microsoft was the InRelease application from InCycle. It basically extends TFS Build with a deployment workflow. It takes the build output (which could be anything that was discussed in the previous post) and, once again, kicks off a workflow that covers everything from environment configuration to authorisation of individual deployment steps.

In SAP circles they speak of “transports” between environments, and this, in my mind, speaks to the same idea: transporting the package into different environments.

I’m really excited about this, and I can already see a couple of my clients making extensive use of it.


Octopus Deploy

Another deployment-focused package that I have been following is Octopus Deploy (OD). OD works on the same premise as InRelease, with agents/deployers/“tentacles” in the deployment environment that do the actual work.

A key differentiator is that OD sources updates etc. from NuGet feeds, so you need to package your deliverables and then publish them to a NuGet server. As I explained in the previous post, NuGet is a very capable platform, and with a number of free NuGet servers around, you can very easily create your own “private” environment for package deployment.

System Center

Do not forget System Center, or more specifically System Center Configuration Manager (SCCM). SCCM is a great way to push or deploy applications (generally MSIs) to different servers or environments. It is very capable in its own right and, more importantly (assuming you have packaged the software properly), can be set up, configured and managed by the ops team.

Thursday, July 18, 2013

Software Deployment (Part 1)

So, one of the tenets of ALM is automation, and probably one of the most time-consuming and problem-prone aspects of the lifecycle is deployment.

It just so happens that I have been in a couple of discussions about how the gap is actually bridged between the point where the developer “completes” and the point where something is moved into the QA, UAT or production environments. In fact, I know of a notable company that lost a great deal of money when a critical system was down for a day, all because one step was not completed by the deployment team.


We all know that continuous integration is a “basic right” when it comes to development environments, and recently there has been an emphasis on continuous deployment, but how do we package and move deliverables from one area to another?

So I want to start with packaging.

There are a couple of ways that we can package our applications, depending on the technology they are built with, the target environments and/or the amount of configuration that is needed when deploying.

Remembering that “If You're Using XCopy, You're Doing It Wrong”, these are a couple of techniques / tools I would keep in mind:

Installers (aka MSI)

Visual Studio used to have a “Microsoft installer” project type, available in VS 2010 and earlier. It was fairly intuitive, and I know a couple of companies that have not considered upgrading because of their investment in this project type.

InstallShield Limited Edition was also packaged with VS 2010 and is still available with VS 2012. Although limited, it does provide basic functionality and some integration with build environments.

There are a couple of other MSI-creation tools, but my favourite has to be WiX. It actually almost became a part of Visual Studio, but it was decided that this would limit the release cadence and inhibit the flexibility to fix bugs and add new functionality.
It is a highly flexible platform that plugs into builds without much effort.
And if you are wondering whether it is any good: a large number of Microsoft’s own applications are released using WiX-based installations.
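To give an idea of what WiX authoring looks like, here is a minimal, hypothetical .wxs sketch for a single-executable product; the names, path and GUID are placeholders, and a real installer would have quite a bit more to it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal WiX 3.x sketch; MyApp, MyCompany and the UpgradeCode GUID
     are placeholders. Compiled and linked with candle.exe/light.exe,
     which slot into a build without much effort. -->
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <Product Id="*" Name="MyApp" Version="1.0.0.0" Language="1033"
           Manufacturer="MyCompany"
           UpgradeCode="11111111-2222-3333-4444-555555555555">
    <Package InstallerVersion="200" Compressed="yes" />
    <MediaTemplate EmbedCab="yes" />

    <Directory Id="TARGETDIR" Name="SourceDir">
      <Directory Id="ProgramFilesFolder">
        <Directory Id="INSTALLFOLDER" Name="MyApp">
          <Component Id="MainExecutable" Guid="*">
            <File Id="MyAppExe" Source="MyApp.exe" KeyPath="yes" />
          </Component>
        </Directory>
      </Directory>
    </Directory>

    <Feature Id="Complete" Level="1">
      <ComponentRef Id="MainExecutable" />
    </Feature>
  </Product>
</Wix>
```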

MS Deploy / Publish

You have been able to “publish” sites from Visual Studio for a while now, but historically this was basically a “smart” xcopy. In recent years quite a bit of work has been done on the Web Deploy (MS Deploy) utilities. Web Deploy is essentially an extensible framework with providers that each supply a certain capability.
One of the more significant abilities is deploying web site “packages” to IIS; then there is the “dbDacFx” provider, which will actually deploy database changes to target servers as part of the deployment.
This is indeed a powerful tool for the deployment arsenal, especially when working in load-balanced and “highly available” environments.

Integrating this into your build is also a cinch: merely add a couple of MSBuild parameters and you are A-for-away. More complex publish scenarios may involve resorting to the “InvokeProcess” activity, but that is not so bad either.
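To give an idea, with Visual Studio 2012 the target environment can be captured in a publish profile and the build pointed at it with two or three MSBuild properties. This is a hedged sketch; the profile, server, site and user names are all hypothetical:

```xml
<!-- Properties\PublishProfiles\Staging.pubxml - a minimal sketch.
     A build can then publish with:
     msbuild MyWeb.csproj /p:DeployOnBuild=true /p:PublishProfile=Staging /p:Password=... -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>MSDeploy</WebPublishMethod>
    <MSDeployServiceURL>https://staging:8172/msdeploy.axd</MSDeployServiceURL>
    <DeployIisAppPath>Default Web Site/MyApp</DeployIisAppPath>
    <MSDeployPublishMethod>WMSVC</MSDeployPublishMethod>
    <UserName>deployuser</UserName>
    <AllowUntrustedCertificate>True</AllowUntrustedCertificate>
  </PropertyGroup>
</Project>
```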

(Indeed, Scott Hanselman’s quote “If You're Using XCopy, You're Doing It Wrong” is the title of a session he did on Web Deploy.)

ClickOnce

If you are working with “client deployable” applications, such as WinForms or WPF applications, ClickOnce should definitely be a consideration. It is, once again, easy to integrate into a build process, and if you can get away with merely “copying” assemblies across, then you have a winner.

NuGet

NuGet is fast becoming the de facto standard for packaging components in the development environment. Companies have also started adopting it in their release management because it is so easy to integrate and modify. It is a very simple yet powerful way of packaging and publishing components. With configuration transformations, and even source code transformations and PowerShell scripts, you can do pretty much anything during the install or uninstall actions. A minimal package definition is sketched below.
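As a rough illustration, a deployment package can be described with a .nuspec file; every name and path below is hypothetical:

```xml
<?xml version="1.0"?>
<!-- MyWebsite.nuspec - a minimal sketch; id, version and paths are
     hypothetical. Build with "nuget pack MyWebsite.nuspec" and push
     the result to your (private) feed with "nuget push". -->
<package>
  <metadata>
    <id>MyCompany.MyWebsite</id>
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Deployment package for MyWebsite.</description>
  </metadata>
  <files>
    <!-- Binaries and content to deploy -->
    <file src="bin\**\*.*" target="bin" />
    <file src="Web.config" />
    <!-- Optional PowerShell hook, run by hosts that support it -->
    <file src="tools\install.ps1" target="tools" />
  </files>
</package>
```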

Quick Comparison

                       MSI   MS Deploy   ClickOnce            NuGet
Database deployment    Yes   Yes         No                   Yes
Install-time UI        Yes   No          No (single prompt)   No
Web sites              Yes   Yes         No                   Yes
Windows applications   Yes   Yes         Yes                  Yes
Extensible             Yes   Yes         No                   Yes
Build integration      Yes   Yes         Yes                  Yes
Automatic updates      Yes   No          Yes                  No
GAC installable        Yes   Yes         No                   Yes

Friday, June 7, 2013

Upgrade from TFS 2010 to TFS 2012

Even though there is talk of TFS 2013, with the first CTP due towards the end of June, I’m still doing quite a few upgrades to TFS 2012 at the moment.

When considering an upgrade to TFS 2012, there are basically two approaches: an in-place upgrade or a migration.

The in-place upgrade is probably a bit less “complex”, even though it can be quite involved. You need to get the environment to a TFS 2012-friendly state (note those service packs!) and then do the upgrade, which can take some time.

I highly recommend doing a test upgrade first: work out and document all the kinks before you shut down and attempt the production upgrade.
The test will give you an indication of what can happen before you start, and of how long the upgrade will actually take.
With an in-place upgrade there is not much you can do to reduce that time; you need to go through all the steps, which involve uninstalling TFS 2010, installing TFS 2012 and then upgrading the databases.

You do gain some time in that you do not need to change the SharePoint or Reporting Services links.

Personally, I prefer doing a migration as opposed to the in-place upgrade. It means I can spend the time configuring a new, “fresh” environment, then take a copy of the TFS 2010 database across and make sure everything works. Any problems can be sorted out in the new environment, and you most probably won’t need to keep an extensive log to remind yourself of the problems during the actual upgrade.

On D-Day you just need to take the TFS 2010 database over and do the post-upgrade configuration steps. This also retains the TFS 2010 environment in case anything goes wrong.

Some things that you need to take note of with the migration are:

1) Rebind SharePoint

Have you opened up the project’s SharePoint portal and found it has red blocks strewn all over it?


I have found that “Repair Connection” will in most cases sort out the links adequately. If, however, you have not stuck to the default SharePoint sites, or you use sites that are located on different servers, you will need to manually “correct” these bindings.

Open up Visual Studio, go to Team Explorer Settings and open the “Portal Settings”.


Re-establishing the connection here works as a last resort (for example, uncheck the “Reports and dashboards…” checkbox, click OK, open it back up, select the checkbox again and click OK).

2) Reporting Services

If you are considering moving SSRS, there are a couple of steps that you need to take into consideration:

  • Restore both the ReportServer and ReportServerTempDB databases on the target server
  • Back up the SSRS encryption key on the source server and then restore it on the target server
  • In Reporting Services Configuration Manager, make sure that only the target server is listed in the “Scale-out Deployment” section. If the old server is still there, remove it.
    • If it will not remove by selecting it and clicking “Remove Server”, open the ReportServer database and remove its entry from the Keys table
  • Make sure that the top-level security (in http://<report server>/Reports) is set adequately
  • Make sure that your TFS data sources (“Tfs2010OlapReportDS” and “Tfs2010ReportDS”) point to the correct server
  • And finally, in the TFS Admin Console, make sure that the Reporting view points to the correct SSRS server

3) Build Template

Make sure that your build templates have been updated correctly and that any custom activities reference the new TFS 2012 assemblies.

4) Finally

Upgrade the project templates from the TFS web access admin site and set up the team members.

 

Depending on the hardware you are doing the upgrade on, a 15 GB TFS database can take 40 minutes to an hour, and a 45 GB database can take in the vicinity of 3 hours.

If you are looking to upgrade to TFS 2012, feel free to give us a shout.

Friday, May 17, 2013

Quality Management with Visual Studio

I have had a couple of discussions around the various aspects of managing quality; I even have a section dedicated to quality management in the training that I offer.

I like to break down quality management as follows:

We all (should) know about the Agile testing quadrants, initially discussed by Brian Marick and later used to form the basis of Lisa Crispin’s book on Agile Testing.

[Figure: the Agile Testing Quadrants]

A lot of people have asked me: “Where do I start?”

I want to take this and break it down into a practical, technical approach.


The basics, at the bottom of the pyramid, are about inspection: does the code look right?

This is the easiest to accomplish. The tools are already built into Visual Studio, and it is a matter of a few clicks to get the results. You can make this a part of the review process, or even automate it using TFS Build to produce these reports every time someone checks in code.
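As a pointer, static code analysis can be switched on per project so that it runs on every local or TFS build. A minimal sketch of the relevant project-file properties:

```xml
<!-- Inside the .csproj: run managed code analysis on every build.
     MinimumRecommendedRules.ruleset is one of the rulesets that
     ships with Visual Studio; swap in your own as needed. -->
<PropertyGroup>
  <RunCodeAnalysis>true</RunCodeAnalysis>
  <CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
</PropertyGroup>
```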

The next step is to verify or test that the functionality of the application/code works as expected. There are a couple of methodologies (TDD, ATDD, etc.) that you could look to in order to get unit testing in place. A personal favourite is the SOLID principles as a guide to designing the application to be test-friendly. Overall, my advice is: just start.

Next is a subtle yet important step, even if it is just to separate the concept of a unit test from an integration test. A unit test focuses on the smallest piece of code in isolation; an integration test combines these units and sees how they perform in unison (a small sketch follows below).
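To make the distinction concrete, here is a minimal MSTest sketch of a unit test; the Calculator class is a trivial stand-in for your own code:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

// A trivial class under test, defined inline just for the sketch.
public class Calculator
{
    public int Add(int a, int b) { return a + b; }
}

[TestClass]
public class CalculatorTests
{
    // Unit test: exercises one class in isolation, with no external
    // dependencies, so it is fast and deterministic.
    [TestMethod]
    public void Add_TwoNumbers_ReturnsSum()
    {
        Assert.AreEqual(5, new Calculator().Add(2, 3));
    }
}
```

An integration test looks exactly the same from MSTest’s point of view; the difference is that it news up the real dependencies (database, services, file system), which makes it slower and more environment-dependent.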

UI testing can be quite difficult in many organisations. UI tests tend to be fairly brittle and need maintenance (which puts a lot of people off).
I like the approach of having manual test cases executed during the sprint or iteration (using something like Microsoft Test Manager), and then, in the next sprint or once the functionality has stabilised a bit, decomposing those into Coded UI tests for eventual automated execution.

Finally, load and performance testing. This usually only happens when there are problems or when a big client requests the stats, right!?
Whether to include this type of testing depends very much on the application and on when and where it is used. A simple five-user, limited-data desktop application has much less of a performance requirement than, for example, a large customer-facing e-commerce site. So apply it when and where it is appropriate.

So, from basic inspection to performance and load testing, it is a journey that you need to travel. Using tools (such as Visual Studio) that incorporate this functionality is a big help; it merely becomes a matter of getting to grips with what is already there for you to use. Combining this with TFS to automate the tests whenever a change is made, or even on a scheduled build, will greatly increase quality, flexibility and agility.

Friday, March 8, 2013

Run Application through Coded UI Tests

I have always liked the idea of using automated testing as a part of a full testing strategy. It makes sense to automate as much as possible to exercise the application thoroughly and repeatedly.

Obviously, if you can run these automations in a scheduled, automated build (using TFS), all the better for increasing quality all round.

Here are some tips that I wanted to share after getting a Coded UI strategy in place recently.

The high-level steps in setting up and executing the automations can be broken down into:

  1. Configure TFS Build to execute automated tests
  2. Deploy all the assemblies and files that are required to run the application
  3. Setup tests
  4. Configure Build definition
  5. Execute the build and watch magic happen

Configure TFS Build

Remember that automation tests are exactly that: they automate the end user’s actions on the application. This means the tests will run the application, perform clicks and type text into text boxes “as a user does”, which in turn means the application needs to run on the desktop.
How do you do that in an automated build, you may ask?

Simple: when configuring the TFS Build Service, it is as simple as checking a checkbox (the one that makes the build service run as an interactive process rather than as a Windows service).


Deploy assemblies

This can get a bit involved. So let’s assume that you need the application to run, and that supporting files must be copied to the application directory, each under its own subdirectory.


Deploying files can occur in one of two ways:

  1. Use the “DeploymentItem” attribute on the tests, or
  2. Use a test settings (.testsettings) file

Even though Microsoft recommends the “DeploymentItem” attribute for performance reasons, I had enough hassles with it in the Coded UI test cases to abandon it and opt for the test settings file instead.
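For completeness, the attribute form looks like this (file and folder names hypothetical); the optional second argument maps the item to a subdirectory of the deployment folder:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class FileDeploymentTests
{
    // Hypothetical items: copy config.xml to the deployment root and
    // the contents of Plugins\ into a "Plugins" subdirectory.
    [TestMethod]
    [DeploymentItem("config.xml")]
    [DeploymentItem(@"Plugins\", "Plugins")]
    public void ApplicationStartsWithDeployedFiles()
    {
        // ... exercise the application here ...
    }
}
```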

The test settings file is also a bit more involved than one would expect, but it is manageable.

  1. Add a test settings file to the solution if you do not already have one
  2. Open the test settings dialog and navigate to the “Deployment” tab
  3. Enable deployment and add the files and folders that are needed to execute your tests
  4. If you were to execute the tests now, you would notice that everything in those folders is simply dumped into the root of the deployment directory; the folder structure is not replicated automatically
  5. Apply the changes in the dialog and then open the test settings file in Notepad or an XML editor
  6. Edit the “DeploymentItem” nodes and add an “outputDirectory” attribute (see the sketch after this list). This will cause the folder structure to be maintained in the deployment folder
  7. Finally, select this as the default test settings file in Visual Studio (“Test –> Test Settings –> Select Test Settings File”)
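A rough sketch of what the edited node looks like inside the .testsettings file (the “Dependencies\Plugins” source folder is a hypothetical example):

```xml
<!-- Inside the .testsettings file. "Dependencies\Plugins" is a
     hypothetical source folder; outputDirectory makes its contents
     land in a "Plugins" subdirectory of the deployment folder
     instead of in its root. -->
<Deployment>
  <DeploymentItem filename="Dependencies\Plugins\" outputDirectory="Plugins" />
</Deployment>
```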

Setup tests

You can go ahead and start recording the tests now.
You can either record the steps to launch the application and then kick off the automation, or you can edit the test and use “ApplicationUnderTest.Launch("application.exe");” (see the sketch below).
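A minimal sketch of the latter approach (the executable name is a placeholder, and a real test would obviously drive the UI and assert something):

```csharp
using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class LaunchTests
{
    [TestMethod]
    public void ApplicationLaunches()
    {
        // Launches the deployed application; the path resolves
        // relative to the test deployment directory.
        ApplicationUnderTest app =
            ApplicationUnderTest.Launch("application.exe");

        // ... drive the UI here, typically via recorded UIMap actions ...

        app.Close();
    }
}
```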

Configure Build Definition

Create a standard build definition using Team Explorer, but when defining the automated tests on the Process tab, remember to use the test settings file that you created and committed to version control.

 

Now all that is left is to execute the build and watch the magic happen.