Friday, November 29, 2013

ALM days : save the date…

ALM days Cape Town is happening on 23 January 2014!

Team Foundation Consulting will be launching ALM days in Cape Town on 23rd January 2014!

Learn about updates to Microsoft’s Application Lifecycle Management offerings which will enable your software development teams to be more productive and to collaborate more effectively. This event provides insight, advice, strategies and techniques to improve quality and ensure the final application meets the needs and expectations of users.

Reserve your seat today!

Find out more at www.alm-days.co.za

The event is free and seats are limited.

Join us for an awesome day of ALM!

Wednesday, November 6, 2013

Launching…ALM days 2014 South Africa!

Finally, the time has come for us to launch the first ALM days in South Africa.

Team Foundation Consulting will be launching ALM days in Cape Town in January 2014!

Learn about updates to Microsoft’s Application Lifecycle Management offerings which will enable your software development teams to be more productive and to collaborate more effectively.

This event provides insight, advice, strategies and techniques to improve quality and ensure the final application meets the needs and expectations of users.

The event is free and seats are limited.
Date, venue and agenda to be confirmed soon.

Cape Town Event

If you are interested, sign up on www.alm-days.co.za and be first to be notified once all the details have been finalised.

We look forward to you joining us for a day focussed on Application Lifecycle Management.

Learn – Discuss – Question – Engage

Meet new & interesting people…

Monday, November 4, 2013

Introducing the Visual Studio ALM Rangers – Niel Zeeman

I have been an avid consumer of the ALM Rangers artifacts, articles and applications for a long time now.

Who are the Rangers?

“The Visual Studio ALM Rangers accelerate the adoption of Visual Studio with out-of-band solutions for feature gaps and value-add guidance for the ALM community.”

Some of the cool “stuff” that you should have seen before:

  • planning and upgrade guidance
  • branching and merging
  • and recently their DevOps tooling and guidance

And finally, I will be on the other side of the fence. :)

Monday, September 23, 2013

    What happened to the TFS 2013 Build Templates?

I have had a couple of people asking me “where are the build templates in TFS 2013?”.

Well, they have been removed!
Don’t worry though, they are still available. :)

    By default, they are now simply stored in TFS and not in the repository.

    When creating a build definition you will notice that the default build templates can simply be selected (as per usual).


When you need to use a custom template, or to modify one of the existing templates, you can simply download the template, make your changes and then upload the updated build template to the repository as per normal.

Microsoft has reworked the build templates, simplifying them quite a lot. Hopefully altering your build processes will no longer be such a daunting task!
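
If you prefer to poke at this from a script, here is a minimal sketch, assuming the TFS 2013 / Team Explorer client object model is installed on the machine; the collection URL and team project name below are made up for the example. It simply lists the process templates that TFS now stores for a team project:

```powershell
# Minimal sketch: list the build process templates TFS 2013 stores for a team project.
# Assumes the Team Explorer 2013 client assemblies are installed.
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client")

# Hypothetical collection URL and team project name
$tpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection(
          [uri]"http://tfs:8080/tfs/DefaultCollection")
$buildServer = $tpc.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])

$buildServer.QueryProcessTemplates("MyTeamProject") |
    Select-Object TemplateType, ServerPath, Description |
    Format-Table -AutoSize
```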

    Tuesday, September 10, 2013

    Visual Studio and Team Foundation Server 2013 RC Released

    Some big news!! The release candidate for VS & TFS 2013 was released last night.

    There are a couple of really nice features added since the preview. One of my favourites is the charting ability for work items. I have done quite a few custom TFS reports, and even though this may reduce the number of jobs that I have, it is VERY cool nonetheless…

The list of things that have been added is fairly large; hopefully I will blog about some of them in the future.

    Note that this does have a “go-live” license, so Microsoft will provide support if you decide to be an early adopter and upgrade your production environments.

    Feel free to contact us if you need more information around upgrade options...

    Friday, August 23, 2013

    Exam 70-498 : Delivering Continuous Value with Visual Studio 2012 Application Lifecycle Management

     

    I finally got the time to go and write the “Delivering Continuous Value with Visual Studio 2012 Application Lifecycle Management” exam.

I was quite nervous about this one, as it has been a while since I have written any form of exam, and because it has a very “non-technical” focus. These “fuzzy” questions can often be very misleading.

Luckily I passed, fairly well actually, so I thought I would jot down some of my crib notes…

    1) Know the TFS process templates

Especially the terminology and the artefacts that are included in the different process templates. Answering is often just a matter of disqualifying the incorrect options until you are left with the only answers that can be correct.

    2) Have a good grasp on the methodologies / processes

Especially Scrum! The Scrum Guide is concise and small enough to read in one sitting (even for me!), so there is no real reason not to work through it in any case.

    Once again, have a good grasp of the CMMI, Agile and Scrum terminology, artefacts and processes.

    3) Read

I really recommend “Professional Scrum Development with Microsoft Visual Studio 2012” by Richard Hundhausen. Even if you are not going to take the exam, it is a very good read indeed.

Have a look online; there are a lot of brain dumps available for this exam. I would take a look at the questions, but really scrutinise the answers: I looked over a couple and there are definitely plenty of wrong answers provided. Just be careful and don’t learn the answers off by heart!

    4) Work through the free jumpstart

    Yes, there is actually a free jumpstart for this exam. Definitely worth spending some time on.

     

Done and dusted!

    Good luck if you are going to give it a go.

    Friday, August 16, 2013

    Why move from VSS to TFS (Very Sore Safe to Truly Fantastic Server)

    Let me give you a hint: Not only is it faster, it’s also more reliable! (There, blog post done : )

    Let me expand on the above:

    It’s fast!

    Seriously, a lot faster.

Anybody who has ever had to sit and pay the VSS tax, dreaming of a post-work beer while waiting for a history lookup, a search, or especially a “View difference”, will know what I mean.

There is a great difference in architecture between the two. I’ll discuss a few of those differences to give you an idea of why you should consider moving.

    Storage:

TFS uses a SQL Server database to store team project collections; VSS uses the file system. So how is this better?

· Size (yes, it does matter) – VSS can only store about 4 GB reliably; TFS databases can go into terabytes

· Reliability – Ever had a network error while checking in on VSS? You’re left with corrupt files and a caffeine overdose. TFS commits a transaction that can be rolled back if there is an error

· Indexed tables mean faster searches – did I mention TFS is faster?

· And of course, with a database as your data store, you get all the usual goodies like mirrored and clustered DBs for TFS, so you never have to lose anything or have any downtime!

    Networking:

    TFS uses HTTP services vs. file shares (That should be enough said)

    · Option of a TFS proxy for remote sites to save bandwidth and speed things up a little

    · Did I mention that TFS is faster?

    Security:

    TFS uses Windows Role-Based Security vs. VSS security (I don’t think the methodology was good enough for someone to even come up with a name for it – I’ll just call it Stup-Id, there we go, you’re welcome ;)

    Windows Role-Based Security vs. VSS’s Stup-Id:

    · With Win Roles you can specify who’s allowed to view, check-out, check-in and lots more. With Stup-Id you can set rights per project, but all users must have the same rights for the database folder. This means all users can access and completely muck up the shared folders. Not pretty.

    Extra functionality and pure awesomeness:

· Shelvesets – these are really handy for storing code that you don’t want to commit just yet. Say you go for lunch and you’re afraid that BigDog might chew up your hard drive again: all you do is shelve your code, which stores it in TFS. Once you’ve replaced said eaten hard drive you just unshelve and... tada! No need to say the dog ate my homework. (See the tf.exe sketch after this list.)

· Code review – Developer A can request a code review from another developer, who can add comments to the code and send it back (basically sharing a shelveset).

    · Gated check-ins: You can set rules to only allow check-ins when certain conditions have been met. For example, only check in code when:

    o the code builds successfully, or

    o all unit tests have passed, or

    o the code has been reviewed

· Work Items – bug/issue tracking made with love. It removes that nagging feeling at the back of your mind that one of these days a PHP or MySQL update will break your free open source ticketing/bug tracking system.

    · Change sets – basically all the items that you’ve changed and are checking in. You can also associate change sets with work items for better issue tracking.

· Build automation – automate builds and deployments (how cool is this?)
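
To make the shelveset point concrete, here is a quick sketch using tf.exe (which ships with Team Explorer); the shelveset name and comment are just examples:

```powershell
# Shelve your pending changes to TFS before lunch (run from a mapped workspace folder)
tf shelve "BeforeLunch-BigDog" /comment:"WIP - parked before lunch"

# After replacing the chewed-up hard drive, pull the changes back down
tf unshelve "BeforeLunch-BigDog"
```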

But for me, the pièce de résistance is this:

Have you ever had a new developer change files outside of the IDE? Maybe they cleared the read-only attribute and made some changes? This completely confuses VSS and is a great way to get your source control out of sync. In TFS you can edit files outside of the IDE to your heart’s content and TFS will pick the changes up and queue them for your next check-in.

    How to move?

    Google “Visual Source Safe Upgrade Tool for Team Foundation Server” and follow the instructions.

    And that is why TFS will make you happy. Better source control means better code quality, leading to happy customers, and maybe being the next Bill Gates (unless you wanted to be Guy Fawkes).


    Thursday, August 15, 2013

    The TFS Apprentice…

    Welcome Dawie...the TFS Apprentice

Having joined TFC at the beginning of July 2013, Dawie Snyman has the unenviable challenge of having to become an expert in all things ALM and TFS.

    Like many of my clients, TFS is completely foreign to him!

    Dawie will be contributing to our blog, offering up a new perspective on TFS... that of the 'first time user'.

    Looking forward to his insights…

See his first post over here…

    Tuesday, August 6, 2013

    Software Deployment (Part 2)

In the previous post I discussed how one could go about packaging software for the long journey from development into production.


In this post I will take a brief look at a couple of tools and applications I have come across that take those packages and automate their deployment. Using them lowers the friction and reduces the reliance on manual (and possibly problematic) intervention.

     

    Continuous Integration

Once again, we all know that continuous integration is a “basic right” when it comes to development environments, but it does not need to be limited to them. If you are using one of the numerous CI environments, extending it to deploy the packages from the previous post should be fairly simple.

I have done this a couple of times, to varying degrees of complexity, in TFS. It is possible to alter the build template to do pretty much anything you require: set up a default deployment mechanism and then, by simply changing a few parameters, you can point it at different environments.

I have done everything from database deployments and remote MSI installations to SharePoint deployments using just the TFS build to do all the work.
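
To illustrate the “change a few parameters” idea, here is a rough sketch, assuming the TFS 2013 client object model is installed and assuming a customised build template that exposes a hypothetical “TargetEnvironment” process parameter; the collection URL, team project and build definition names are made up as well:

```powershell
# Sketch: queue a deployment build and point it at a different environment
# purely by overriding a process parameter. Assumes Team Explorer 2013 assemblies.
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client")
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Workflow")

$tpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection(
          [uri]"http://tfs:8080/tfs/DefaultCollection")
$buildServer = $tpc.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])

$definition = $buildServer.GetBuildDefinition("MyTeamProject", "MyApp.Deploy")   # hypothetical names
$request    = $definition.CreateBuildRequest()

# Override the process parameters exposed by the customised template
$params = New-Object 'System.Collections.Generic.Dictionary[string,object]'
$params["TargetEnvironment"] = "UAT"    # hypothetical parameter defined in the custom template
$request.ProcessParameters = [Microsoft.TeamFoundation.Build.Workflow.WorkflowHelpers]::SerializeProcessParameters($params)

$buildServer.QueueBuild($request)
```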

     

    3rd Party Deployment Agents

    InRelease

You must have heard by now: a very exciting acquisition by Microsoft was the InRelease application from inCycle. It basically extends TFS Build and adds a deployment workflow. It takes the build output (which could be anything that was discussed in the previous post) and, once again, kicks off a workflow that covers everything from environment configuration to authorisation of deployment steps.

In SAP they speak of “transports” between environments, and this, in my mind, speaks to the same idea of transporting the package into different environments.

    I’m really excited about this, and I can already see a couple of my clients making extensive use of it.


    Octopus Deploy

Another deployment-focussed package that I have been following is Octopus Deploy (OD). OD works on the same premise as InRelease, having agents/deployers/tentacles in the deployment environment that “do the actual work”.

    A key differentiator is that OD sources updates etc. from NuGet feeds, so you need to package your deliverables and then post them to a NuGet server. As I explained in the previous post, NuGet is a very capable platform, and with a number of free NuGet servers around, you can very easily create your “private” environment for package deployment.
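
As a rough example, assuming you already have a .nupkg and a private NuGet feed for Octopus to consume (the feed URL and API key below are obviously placeholders), pushing a package is a one-liner with nuget.exe:

```powershell
# Push a deployment package to a private NuGet feed (URL and API key are placeholders)
nuget push .\MyService.1.0.3.nupkg YOUR-API-KEY -Source http://nuget.example.local/nuget
```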

    System Center

Do not forget System Center, or more specifically System Center Configuration Manager (SCCM). SCCM is a great way to push or deploy applications (generally MSIs) to different servers or environments. It is very capable in its own right and, more importantly (assuming you have packaged the software properly), it can be set up, configured and managed by the ops team.

    Thursday, July 18, 2013

    Software Deployment (Part 1)

So, one of the tenets of ALM is automation, and probably one of the most time-consuming and problem-prone aspects of the lifecycle is deployment.

It just so happens that I have been in a couple of discussions about how the gap between what the developer “completes” and what is moved into the QA, UAT or production environments is actually bridged. In fact, I know of a notable company that lost a great deal of money when a critical system was down for a day because one step was not completed by the deployment team.


We all know that continuous integration is a “basic right” when it comes to development environments, and recently there has been an emphasis on continuous deployment, but how do we package and move deliverables from one area to another?

    So I want to start with packaging.

    There are a couple of ways that we can package our applications, based on the technology that they are built in, the target environments and/or the amount of configuration that is needed when deploying.

    Remembering that “If You're Using XCopy, You're Doing It Wrong”, these are a couple of techniques / tools I would keep in mind:

    Installers (aka MSI)

Visual Studio used to have a “Microsoft” installer (setup) project type, available in VS 2010 and earlier. It was fairly intuitive, and I know a couple of companies that have not considered upgrading because of their investment in this project type.

InstallShield Limited Edition was also packaged with VS 2010 and is still available with VS 2012. Although limited, it does provide basic functionality and some integration with build environments.

There are a couple of other MSI creation tools, but my favourite has to be WiX. It was actually almost made a part of Visual Studio, but it was decided that that would limit the release cadence and inhibit the flexibility to fix issues and add new functionality.
It is a highly flexible platform, plugging into builds without much effort.
And if you are wondering whether it is any good: a large number of Microsoft’s own applications are actually released using WiX-based installations.

    MS Deploy / Publish

You have had the ability to “publish” sites from Visual Studio for a while now, but it was basically a “smart” xcopy. In recent years there has been quite a bit of work done on the Web Deploy (MS Deploy) utilities. It is basically an extensible framework with providers that each supply certain capabilities.
One of the more significant abilities is deploying web site “packages” to IIS; then there is the “dbDacFx” provider, which will actually deploy database changes to the target servers as part of the deployment.
This is indeed a powerful tool for the deployment arsenal, especially when working in load-balanced and “highly available” environments.

Integrating this into your build is also a cinch: merely add a couple of MSBuild parameters and you are A for away. More complex publish scenarios may require you to resort to the “InvokeProcess” activity, but that is not so bad either.

(Indeed, the Scott Hanselman quote “If You’re Using XCopy, You’re Doing It Wrong” is the title of a session he did on Web Deploy.)
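
For example, assuming a publish profile called “Staging” has been created in the web project (the project and profile names here are hypothetical), the whole Web Deploy publish boils down to a handful of MSBuild parameters:

```powershell
# Publish a web project using its "Staging" publish profile via Web Deploy.
# Run from a VS 2012/2013 developer command prompt so msbuild.exe is on the path.
& msbuild .\MyWebApp\MyWebApp.csproj `
    /p:Configuration=Release `
    /p:DeployOnBuild=true `
    /p:PublishProfile=Staging `
    /p:UserName=deployUser /p:Password=$env:DEPLOY_PASSWORD   # Web Deploy credentials, if not stored in the profile
```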

    ClickOnce

If you are working with “client deployable” applications, such as WinForms or WPF applications, ClickOnce should definitely be a consideration. It is, once again, easy to integrate into a build process, and if you can get away with merely “copying” assemblies across, then you have a winner.
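
For instance, a ClickOnce publish can be driven straight from MSBuild; the project name, version and install URL below are placeholders:

```powershell
# ClickOnce publish from the command line (output lands in bin\Release\app.publish)
& msbuild .\MyWpfApp\MyWpfApp.csproj `
    /t:Publish `
    /p:Configuration=Release `
    /p:ApplicationVersion=1.0.0.42 `
    '/p:InstallUrl=http://deploy.example.local/MyWpfApp/'
```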

    NuGet

NuGet is fast becoming the de facto standard for packaging components in the development environment. Companies have also started adopting it in their release management because of the ease of integration and modification. It is a very simple yet powerful way of packaging and publishing components. With configuration and even source code transformations, plus PowerShell scripts, you can do pretty much anything during install or uninstall actions.
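
To show how little ceremony is involved, here is a minimal sketch of packaging a build output as a NuGet package; the package id, version and paths are made up, and it assumes nuget.exe is on the path:

```powershell
# Write a minimal (hypothetical) package manifest next to the build output...
@"
<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyService</id>
    <version>1.0.3</version>
    <authors>TFC</authors>
    <description>Deployment package for MyService.</description>
  </metadata>
  <files>
    <file src="bin\Release\**\*.*" target="content" />
  </files>
</package>
"@ | Set-Content .\MyService.nuspec

# ...and turn it into a .nupkg ready for a feed
nuget pack .\MyService.nuspec
```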

    Quick Comparison

|                      | MSI | MS Deploy | ClickOnce          | NuGet |
|----------------------|-----|-----------|--------------------|-------|
| Database Deployment  | Yes | Yes       | No                 | Yes   |
| Install Time UI      | Yes | No        | No (single prompt) | No    |
| Web Sites            | Yes | Yes       | No                 | Yes   |
| Windows Application  | Yes | Yes       | Yes                | Yes   |
| Extensible           | Yes | Yes       | No                 | Yes   |
| Build Integration    | Yes | Yes       | Yes                | Yes   |
| Automatic Updates    | Yes | No        | Yes                | No    |
| GAC Installable      | Yes | Yes       | No                 | Yes   |