Friday, 2 October 2015

Problem solved! Git-LFS and VSO

One of the problems you might face with Git is the performance hit once you start storing binaries in version control.

It is actually by design – Git takes a content snapshot at every commit, and it can’t compute deltas on binary files the way it does on text files.

Fair enough, it has lots of other pros (and cons too), but the inability to store binaries in an easy and non-disruptive way makes it hard to adopt as the shared tool for heterogeneous development teams. If you have people working on the UI of your applications and you want them to version the .psd files they use, you can’t use Git.

Well, you couldn’t. GitHub developed Git Large File Storage to sort this issue out, but it isn’t enabled by default and eventually you would need to pay for usage (beyond the 1GB free quota of storage and bandwidth). It isn’t an out-of-the-box solution.

But yesterday Microsoft announced Git-LFS support on VSO, and for me it is a game changer.

Firstly, it is enabled by default on all Visual Studio Online Git repositories. All you need to do is install the Git-LFS extension – nothing else.

Then, storage is free and unlimited, so you don’t have to worry about limits, quotas and usage. It’s there – use it (if it makes sense, obviously).

Eventually, it will be included in Team Foundation Server 2015 Update 1, meaning you will get exactly the same experience on-premises.

That’s marvellous, really. It solves the aforementioned issue in an effortless way, making Git even more approachable.
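If you want to see what that looks like in practice, here is a minimal sketch of the workflow once the extension is installed – the .psd pattern and the file name are just placeholders:

    # One-time setup: hook Git-LFS into your local Git configuration
    git lfs install

    # Tell Git-LFS which files to handle – Photoshop files, in this example
    git lfs track "*.psd"

    # The tracking rule lives in .gitattributes, so version it along with the binaries
    git add .gitattributes
    git add mockup.psd
    git commit -m "Version Photoshop mockups through Git-LFS"

    # On push, the binaries go to the LFS store instead of bloating the repository
    git push origin master

From that point on, the people editing those .psd files work with Git exactly like everybody else on the team.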

Monday, 28 September 2015

Personal Access Tokens in Visual Studio Online

When you try to access some services in Visual Studio Online, you might need to enter your Alternate Credentials. Think about Git, for example.

This approach works, no question about it. But in terms of security it isn’t the best choice: it isn’t granular at all, and the credentials have no expiry date.

But Visual Studio Online also provides Personal Access Tokens, to fix this inconvenience. A Personal Access Token offers better granularity and expiration management:


And how do you use it? You need to store it safely (you can’t access it after creation, by design), and then you can use that string in place of the password when asked.
The username in that case can be anything; it is simply not used.
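To make it concrete, here is a sketch of how the token is used with Git over HTTPS – the account, project and repository names below are made up, and MY_TOKEN stands for the string you saved:

    # Clone using the token in place of the password; the username really can be anything
    git clone https://anything:MY_TOKEN@myaccount.visualstudio.com/DefaultCollection/MyProject/_git/MyRepo

    # Or clone as usual and paste the token when Git prompts for a password
    git clone https://myaccount.visualstudio.com/DefaultCollection/MyProject/_git/MyRepo

When the token expires (or you revoke it), that access stops working without touching your real credentials.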


Tuesday, 15 September 2015

Why should I split the columns in the Kanban Board?

A very interesting discussion came up at the SmartDevsUG meeting last night about the Kanban Split Columns in TFS and VSO. What is the real rationale behind them in a real-world project? Why should I have a Resolved column split into Doing and Done, for example?


Without taking the Kanban principles behind this tool into consideration, I have a simpler explanation. If you step back from the technical stuff for a moment and think about the business side of it, it is pretty clear.

For example, you are working on a customer hotfix for your product – the code is written, the fix is thoroughly tested and you are technically done. But you aren’t completely done – did you deliver the hotfix to the customer? Or, from a different perspective, has it been billed? Until that hurdle is cleared, you are in a Resolved-Done situation.

Thursday, 10 September 2015

Reading the TFS update logs

Have you ever read a TFS update log? If not, you should.

Whenever you do anything to the TFS databases, every operation is logged, obviously. You can reach these logs from the Administration Console:



The amount of data in there is really helpful – for example, each Stored Procedure used by the installer is detailed, giving you an idea of what happens to your databases during each update:


And you also get a recap of how long it took, with individual timings for each step:


That is interesting enough if everything works, but what about when something fails?

Well, that’s even better, because if you experience any error whatsoever, you will get a detailed exception stack trace in the logs:


This gives you real, actionable insight before touching anything or picking up the phone to call CSS.

Friday, 28 August 2015

Draft Builds in the new Team Build

With the new Team Build you can work on Build Definitions asynchronously, without impacting other team members or wasting time cloning the existing build.

For example, you might want to add some tasks to an existing Build Definition – you can then Save it as a draft.


You are actually creating a clone of the existing Build Definition with the new steps you added. You can even run it!


All the builds you run with this definition are marked with a .DRAFT suffix in the name:


This is no different from any other build – it just doesn’t affect the existing published builds:


Then, once you save it, everything is merged back into the original definition with no further effort.

Tuesday, 25 August 2015

Quickly change the TFS Integration Platform settings

You installed the TFS Integration Platform, and now you want to move its database. Or you want to change the temporary folder used during migrations.

Should you waste time reinstalling it?

No – all you need to do is modify a few strings in the MigrationToolServers.config file:
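To give an idea of what to look for, this is a rough sketch of the relevant entries – the element and key names here are assumptions from memory, so double-check them against your own copy of the file:

    <!-- Sketch only: verify the actual names in your MigrationToolServers.config -->
    <connectionStrings>
      <!-- Repoint the Integration Platform database to the new SQL Server instance -->
      <add name="TfsMigrationDBConnection"
           connectionString="Data Source=NEWSQLSERVER;Initial Catalog=Tfs_IntegrationPlatform;Integrated Security=True" />
    </connectionStrings>
    <appSettings>
      <!-- Temporary folder used as the workspace during migrations -->
      <add key="WorkSpaceRoot" value="D:\IntegrationPlatformTemp" />
    </appSettings>

Change the values, save the file, and the tool will pick up the new settings on the next run.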


Thursday, 13 August 2015

Pre-upgrading a large Team Project Collection to TFS 2015

TFS 2015 is out and we are all rushing to upgrade our Team Project Collections!

But if you have a collection bigger than 1TB you should step back and read some documentation first. This page in particular explains what to do if you want to upgrade such a huge collection.

The reason for using TfsPreUpgrade is very simple – you don’t want your users stuck for days while you upgrade Team Foundation Server. The tool partially upgrades the schema while the server is still available, so that the actual upgrade (the one done by the installer) takes no longer than two days – which means a weekend, which also means no perceivable service interruption.

You can run TfsPreUpgrade from whatever machine you like. In the testing environment I am using at the moment, I am running it from a plain virtual machine that is going to host the Application Tier but has nothing installed yet. You basically need access to SQL Server, and that’s it.

First thing, run it with the Estimate switch. It will give you an idea of the time and space needed for the upgrade:


To give some context, this is data for a 2TB collection. A few hiccups might happen while using the Run switch, especially with a temporary test environment:


but you can just relaunch TfsPreUpgrade and, after a quick check, it will resume from the point where it failed. It is also smart enough not to expand your databases further if that isn’t needed:


Once it is done, your offline upgrade will be much faster, because these operations were carried out in advance!
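For reference, the overall flow looks roughly like the commands below. The Estimate and Run switches are the ones described above, but the way I point the tool at the collection database here is an assumption of mine – check the documentation page for the exact parameter syntax:

    REM Estimate the time and extra disk space the pre-upgrade will need
    REM (hypothetical "SQL instance;collection database" value - verify the real syntax)
    TfsPreUpgrade.exe Estimate /DatabaseName:"SQLPROD01;Tfs_DefaultCollection"

    REM Run the pre-upgrade while the collection stays online;
    REM if it fails, just relaunch it and it resumes from where it stopped
    TfsPreUpgrade.exe Run /DatabaseName:"SQLPROD01;Tfs_DefaultCollection"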