Migrate TFS to VSTS

Migrate your TFS instance to a VSTS account without any data/history loss (high fidelity)

by | Apr 19, 2017 | VSTS and TFS

Introduction

This post describes the steps required to migrate a TFS instance to VSTS. In this scenario no data is lost, and all history (work items and sources) will be present in the VSTS account. Please note that this type of migration (and therefore some of the required tools) is still in preview; screenshots and steps may differ from the tooling that is currently available.

 

1. Getting started

To migrate from TFS to VSTS some prerequisites need to be in place.

  • Make sure TFS is on the right version. Only the current and previous versions of TFS are supported for migration to VSTS.
  • Make sure your SQL database uses one of the following collations:
    • SQL_Latin1_General_CP1_CI_AS
    • Latin1_General_CI_AS
  • Make sure your identities are synced to an Azure Active Directory. VSTS does not use the local Active Directory; it uses Azure Active Directory.
  • Make sure that MSDN licenses are assigned to an email address that corresponds with a user’s email address in the Azure Active Directory.
  • Make sure you have a migration key. You can get two (one for a dry run, one for production) by filling in the following questionnaire: https://aka.ms/vstsdataimportpreviewform. Test the migration first with the dry-run key; if everything completes successfully, you can use the production key.
  • This migration is intended for collections smaller than 150 gigabytes.

2. Validate

In this step the TFS collection is validated for migration to VSTS. Not all configurations are suitable for migration; in some cases you need to adjust the current configuration of TFS (think of process templates, permissions, etc.). To validate your collection, download the migration tool here and run the following command:

TfsMigrator.exe validate /collection:http://localhost:8080/tfs/defaultcollection

  • Make sure that the collection URL is pointing to the TFS collection you want to migrate.

If validation succeeds, the command output will end with:

Collection validation was successful. Recommended next steps: …

If validation was not successful, check the log files located in the logs folder next to TfsMigrator.exe. The log lists the errors explaining why your TFS collection is not valid. Most likely you must change some settings in the process templates of your projects, or some permissions are not set up correctly. After you have resolved the errors, rerun the validation.

3. Prepare

After validation succeeds, it is time to prepare the environment. In this step, we generate the files that are needed by the import tool to import your collection. One file contains the configuration for the import; the other contains a mapping between local identities and Azure Active Directory identities. The files can be generated by running the following command:

TfsMigrator.exe prepare /collection:http://localhost:8080/tfs/defaultcollection /tenantDomainname:yourdomain.onmicrosoft.com

  • Make sure that the collection URL is pointing to the TFS collection you want to migrate.
  • Make sure that the tenantDomainname parameter points to your Azure Active Directory domain; for example: yourdomain.onmicrosoft.com
  • When running this command, you need to log in with an account that is part of the Azure Active Directory

After running the command, you will find a folder next to TfsMigrator.exe: /log/defaultcollection/{id}/. In this folder you will find the files that are prepared for the migration. There are two important files:

 

File	Description
Import.json	This file contains the configuration that is used during the migration.
IdentityMap.csv	This file contains the mapping between local identities and Azure Active Directory identities.

3.1 import.json

The import.json file contains the configuration that is needed during the migration. In this step we will edit import.json so it meets our requirements. The import.json has the following structure:

{
  "Source": {
    "Location": "<Provide the SASKey to the Azure storage container with the collection and import files.>",
    "Files": {
      "IdentityMapping": "IdentityMap.csv",
      "Dacpac": "Tfs_DefaultCollection.dacpac"
    }
  },
  "Target": {
    "AccountName": "<Provide a name for the account that will be created during the import.>",
    "Properties": {
      "Region": "<Provide the short name for a region: CUS, WEU, MA, EAU, SBR.>"
    }
  },
  "Properties": {
    "ImportCode": "<Provide the Import Code assigned to your Import.>"
  },
  "ValidationData": {
    "DatabaseCollation": "SQL_Latin1_General_CP1_CI_AS",
    "CommandExecutionCount": 535,
    "CommandExecutionTime": 1.8917285090000002,
    "TfsVersion": "Dev15.M112.5",
    "DatabaseTotalSize": 220,
    "DatabaseBlobSize": 129,
    "DatabaseTableSize": 91,
    "ActiveUserCount": 3,
    "TenantId": "a4c70ee8-6ed4-4390-bace-jaanee01",
    "Force": false
  }
}

 

The following fields need to be filled in:

Field	Description	Example Value
Location	SAS key to the Azure storage container. We will generate this key later on.	https://tsavstsmigration.blob.core.windows.net/tfsmigration?sv=…{key}…
AccountName	Name of the VSTS account that will be created during the migration	Myaccount-import
ImportCode	The import code that is mentioned in the getting started section of this post	30139ae2-8888-2222-a000-6000e8882390e
Region	The region in which the imported VSTS account will be created	WEU
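If you run several dry-run and production imports, filling in these fields by hand gets error-prone. Below is a minimal Python sketch (illustrative only; the field names match the import.json structure shown above, while the helper function and example values are this post's own, not part of the official tooling) that fills in the required fields programmatically:

```python
import json

def fill_import_config(config: dict, sas_url: str, account_name: str,
                       region: str, import_code: str) -> dict:
    """Fill in the import.json fields described in the table above."""
    config["Source"]["Location"] = sas_url
    config["Target"]["AccountName"] = account_name
    config["Target"]["Properties"]["Region"] = region
    config["Properties"]["ImportCode"] = import_code
    return config

# Example usage with the skeleton generated by "TfsMigrator.exe prepare"
# (example values taken from the table above):
skeleton = {
    "Source": {"Location": "", "Files": {"IdentityMapping": "IdentityMap.csv",
                                         "Dacpac": "Tfs_DefaultCollection.dacpac"}},
    "Target": {"AccountName": "", "Properties": {"Region": ""}},
    "Properties": {"ImportCode": ""},
}
filled = fill_import_config(skeleton, "https://...your-sas-url...",
                            "Myaccount-import", "WEU",
                            "30139ae2-8888-2222-a000-6000e8882390e")
print(json.dumps(filled, indent=2))
```

You would load your real import.json with json.load, pass it through this function, and write it back with json.dump.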

3.2 identityMap.csv

The IdentityMap.csv file contains the mapping of the local Active Directory users (used in TFS) to the users in the Azure Active Directory. Review this file carefully; if the synchronization between the local Active Directory and the Azure Active Directory is set up well, you probably don’t have to change anything. In some cases, not all local Active Directory users are synced to the Azure Active Directory; the CSV file will then show that a user is not present. It is recommended to have all active users available in the Azure AD.

Identities that are not in the identity mapping or where the identity mapping information is not complete, will end up as “historical identities.” These identities do not have access to VSTS and do not require licenses.
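To spot users who would end up as historical identities, you can scan the CSV for rows with an empty Azure AD column. The Python sketch below assumes a simple two-column layout (local identity, Azure AD identity) purely for illustration; check the actual headers of your generated IdentityMap.csv, which may differ:

```python
import csv
import io

def find_unmapped(identity_map_csv: str) -> list:
    """Return local identities whose Azure AD column is empty.

    Assumes a two-column CSV (local identity, Azure AD identity);
    the real IdentityMap.csv layout may contain additional columns.
    """
    unmapped = []
    reader = csv.reader(io.StringIO(identity_map_csv))
    next(reader)  # skip the header row
    for row in reader:
        local, azure = row[0], row[1]
        if not azure.strip():
            unmapped.append(local)
    return unmapped

# Example: bob was never synced to Azure AD, so he would become
# a historical identity after the import.
sample = (
    "LocalIdentity,AzureAdIdentity\n"
    "CONTOSO\\alice,alice@yourdomain.onmicrosoft.com\n"
    "CONTOSO\\bob,\n"
)
print(find_unmapped(sample))
```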

 

4. Import

After preparation, it’s time to do the import. The first thing that needs to be done is extracting the collection database to a .dacpac file. The .dacpac file and the IdentityMap.csv file are then uploaded to an Azure storage account. For this storage account we will create a SAS key. The SAS key is entered in the import.json file, and then the migration can start.

4.1 Extracting the collection database

To extract the collection database to a .dacpac file, SqlPackage.exe is required. SqlPackage.exe is part of the SQL data tools, which can be downloaded here. Once the SQL data tools have been installed, navigate with a command prompt to: C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130. This is where you will find SqlPackage.exe.

By running the following command, you are able to create a .dacpac file from your collection database:

SqlPackage.exe /sourceconnectionstring:"Data Source=SQL-SVR01\SQL01;Initial Catalog=Tfs_DefaultCollection;Integrated Security=True" /targetFile:c:\Dacpac\tfs_defaultcollection.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory

  • Make sure that the "Data Source" is pointing to the SQL instance that is used by TFS. If you are using the default instance of SQL Server, you only have to enter the server name here.
  • Make sure that the "Initial Catalog" is pointing to the SQL database that corresponds with the team project collection you want to migrate.
  • The path you enter for "targetFile" is the path where the .dacpac file will be created

4.2 Uploading the .dacpac file and IdentityMap.csv to an Azure Storage Account

The migration requires that you place your .dacpac file and IdentityMap.csv file in an Azure storage container. The storage account needs to be in the same region as where you want your VSTS instance to be; for example: West Europe. The storage account is only for temporary use; after the migration, you can remove it.

To create a storage account:

  • Navigate to the Azure Portal
  • Click “+” and select Storage
  • Now select Storage account – blob, file, table, queue
  • Fill in the following fields:
    • Name: name of the storage account; for example: TfsMigration01
    • Deployment model: Resource Manager
    • Account kind: General purpose
    • Performance: Standard
    • Replication: Read-access-geo-redundant storage (RA-GRS)
    • Storage service encryption: disabled
    • Subscription: Your Azure subscription
    • Resource group: Create a new one or use an existing one.
    • Location: the same location as where you want to have your VSTS account.
  • Click Create

The .dacpac file needs to be placed in a container within the storage account. Create a container in the storage account that just has been created:

  • In the Azure Portal, navigate to the storage account that has been created earlier.
  • Click on Blobs
  • Click on “+ Container”
  • Fill in the following fields:
    • Name: name of the container; for example: migrationcontainer
    • Access type: Blob
  • Click on Create

When the storage account and container are in place, it’s time to upload the .dacpac file and IdentityMap.csv file. You can do this via the Azure Portal (navigate to the container and click Upload). The default upload settings are fine.

4.3 Retrieving Azure Storage key

The SAS key that is required in the import.json file is used to access the storage account. You can generate a SAS key by running the following PowerShell script:

$storageAccountName = "{Storage Account Name}"
$storageAccountKey = "{Storage Account Primary Access Key}"
$container = "{Container Name}"

$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$sas = New-AzureStorageContainerSASToken -Name $container -Permission rl -Context $context -ExpiryTime ([DateTime]::UtcNow.AddDays(7)) 
Write-Host "$($context.BlobEndPoint)$($container)$($sas)"

In the above script; fill in the following variables:

Variable	Description	Example Value
Storage Account Name	The name of the storage account that was created earlier.	TfsMigration01
Storage Account Primary Access Key	The primary storage account key (can be found on the “Access keys” pane of your storage account)	6K53NO76Y4WcRtxY/LWK43/E9VUeSaiDtHGEWQUBFO9gTh8RyITubsD25kklsXpWwDyrVWM5GdpEUIRVe+HCrDg==
Container Name	The name of the container that was created earlier.	migrationcontainer

Running the above script will get you a URL like:

https://tsavstsmigration.blob.core.windows.net/tfsmigration?sv=2015-04-05&sr=c&sig=e352oxhtRF33tJ%2FATFFaQOabtnvZ2ATytpZELykfW5E%3D&se=2017-04-26T12%3A51%3A00Z&sp=rl
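Before pasting this URL into import.json, it can be worth a quick sanity check: the token should grant read and list permissions (sp=rl) and carry an expiry (se) far enough in the future for the import to run. A small Python sketch (illustrative only; the import tooling performs its own validation) of such a check:

```python
from urllib.parse import urlparse, parse_qs

def check_sas_url(url: str) -> bool:
    """Rough sanity check on a container SAS URL: read + list
    permissions present, and an expiry timestamp is set."""
    query = parse_qs(urlparse(url).query)
    perms = query.get("sp", [""])[0]
    has_expiry = "se" in query
    return "r" in perms and "l" in perms and has_expiry

# Example using a URL shaped like the one shown above:
sample = ("https://tsavstsmigration.blob.core.windows.net/tfsmigration"
          "?sv=2015-04-05&sr=c&sig=e352...&se=2017-04-26T12%3A51%3A00Z&sp=rl")
print(check_sas_url(sample))  # True
```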

4.4 import.json

Now that the .dacpac file and IdentityMap.csv file have been uploaded to the storage account, the import.json file can be completed. The following fields need to be filled in:

Field	Description	Example Value
Location	SAS key to the Azure storage container.	https://tsavstsmigration.blob.core.windows.net/tfsmigration?sv=…{key}…
IdentityMapping	Name of the IdentityMap.csv file	IdentityMap.csv
Dacpac	Name of the .dacpac file	Tfs_defaultCollection.dacpac

 

4.5 Schedule Import

All settings, files, storage accounts etc. are set. It is now time to schedule the migration. The migration can be scheduled via the following command:

TfsMigrator.exe import /importFile:C:\tfs_vsts_migration\import.json

Your migration has now been scheduled in Microsoft’s systems. You will shortly receive an e-mail that the migration has started, and a second e-mail when the migration has completed.

If this was your dry run, you can now go ahead with the other migration key to migrate your production environment.

 

5. After the migration

The migration is now completed. Your VSTS account is fully operational and users can start using it. Although everything is functional, a few small things still need to be taken care of:

  • Add your build agents to the VSTS environment (deploy an agent on your servers)
  • Remove the temporarily created storage account

 

Content about TFS to VSTS migrations

Please note that there is a lot of great content about TFS to VSTS migrations. Please check out the following links:
