Visual Studio 2017 – vNext Build Automation – .vdproj based installer

Problem – We need to build a “.vdproj” (installer) based project in Visual Studio 2017 using vNext Build Automation.

Solution

Follow the below steps –

  1. The first step is to configure the build environment.
    1. Install “Microsoft Visual Studio 2017 Installer Projects” (InstallerProjects.vsix). Download it from https://marketplace.visualstudio.com/items?itemName=VisualStudioProductTeam.MicrosoftVisualStudio2017InstallerProjects.
    2. To install it, double-click on InstallerProjects.vsix and follow the default options to complete the installation.
    3. Add the registry key described below. To add it,
      1. Under HKEY_CURRENT_USER\SOFTWARE\Microsoft\VisualStudio, add the node 15.0_[instance ID]_Config, like the 15.0_a71083cb_Config shown below.
      2. Add an MSBuild folder (key) in it.
      3. In MSBuild, create a new DWORD value “EnableOutOfProcBuild” and set it to 0.

          [screenshot]
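The same change can be scripted. Here is a minimal PowerShell sketch, assuming the instance suffix a71083cb from the example above (use the suffix from your own 15.0_*_Config node):

    # Create the MSBuild key and the EnableOutOfProcBuild DWORD value (0) for this VS 2017 instance.
    $key = "HKCU:\SOFTWARE\Microsoft\VisualStudio\15.0_a71083cb_Config\MSBuild"
    New-Item -Path $key -Force | Out-Null
    New-ItemProperty -Path $key -Name "EnableOutOfProcBuild" -Value 0 -PropertyType DWord -Force | Out-Null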

  2. In the TFS build definition, add a Command Line task with the below information –
    1. Tool: C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\devenv.com
    2. Arguments: $(Build.SourcesDirectory)\Deployment.sln /build $(BuildConfiguration)
  3. Finally, initiate the build. It will build the “.vdproj” projects defined in the solution and create the MSI installer file.
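For reference, the task effectively runs a command like the below sketch (shown here in PowerShell; Release is just an example value of $(BuildConfiguration)):

    # $(Build.SourcesDirectory) is exposed to scripts as the BUILD_SOURCESDIRECTORY environment variable.
    & "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\devenv.com" "$env:BUILD_SOURCESDIRECTORY\Deployment.sln" /build Release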

TFS Drop Location Clean Up Utility in PowerShell

In TFS, we do build automation, and the output of each automated build is kept on a DFS share called the drop location. When you build an application multiple times using the automated build process, each run copies its output to the drop location. Let’s say you have an application “abc” and you built it ten times. The output is copied to the drop location in different folders like –

[DFS Share]\abc\1.0.0.0

[DFS Share]\abc\1.0.0.1

[DFS Share]\abc\1.0.0.2

….

[DFS Share]\abc\1.0.0.10

Over a long interval of time, the drop location accumulates a large number of output folders. Now think what happens when there are multiple applications and multiple teams working on them: it consumes a lot of disk space.

To address this issue, one has to manage the drop location manually, and manual activity is always time consuming and prone to errors.

One has to look for some kind of automation to manage the drop location. Here are the basic requirements –

1.) We need to delete output folders older than n days (say 10).

2.) We need to keep the last three output folders even if they are older than n days.

3.) It should send an email to the concerned team/person notifying them of the clean-up performed.

These three requirements can be easily implemented using PowerShell scripting; the core of the first two fits in a single pipeline, as sketched below.
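Before the full script, here is a minimal sketch of the core idea, assuming version-named build folders (the path, day count, and keep count are placeholders; -WhatIf makes it a dry run):

    Get-ChildItem "P:\temp\abc" -Directory |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-10) } |
        Sort-Object { [version]$_.Name } -Descending |
        Select-Object -Skip 3 |
        Remove-Item -Recurse -Force -WhatIf    # remove -WhatIf after verifying the output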

Here is the script content –

Minimum PowerShell version required is v5.

——————————-

<#
.Synopsis
   Script to delete all the folders, with their content, from the TFS drop location that are older than n days.
.DESCRIPTION
   Script to delete all the folders, with their content, from the TFS drop location that are older than n days,
   always keeping the newest few version folders.
#>

#Updated - First Version

$Global:DataForEmailBody = ""   # global accumulator for the email body

Function SendEmail {

    $Email = New-Object -ComObject "CDO.Message"
    $Email.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
    $Email.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = 'XXXXX'
    $Email.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
    $Email.Configuration.Fields.Update()

    $Email.From = "TFSAdmin@xyz.com"
    $Email.To = "abcd@xyz.com"
    $Email.Subject = "Utility | Scheduled Job | TFS Drop Location Cleanup Activity"
    $Email.HTMLBody = "<font color=""black"" face=""Calibri, Verdana"" size=""3"">
                    <p><b> All, </b></p>
                    <p><mark> fyi....</mark></p>
                    <p> TFSDropLocationCleanupUtility has cleaned up the drop location. Here are the details...</p>
                    <p>----------------------------------------------</p>
                    <a> $Global:DataForEmailBody </a>
                    <p>----------------------------------------------</p>
                    <p> Regards, TFS Administration Team </p>
                        <font color=""blue"" face=""Arial, Verdana"" size=""2"">
                        <p> This is an auto generated email from the scheduled job in TFS. If you need more information, contact the TFS admin. </p>"

    $Email.Send()
}

#Start-Transcript -Path ".\TFSDropLocCleanupUtility.log"
$DayCount = 30                  # Delete folders last written before this many days ago
$dt = (Get-Date).AddDays(-$DayCount)
$FolderCountsToSkip = 3         # Always keep the newest n version folders
$DropLocation = "P:\temp"
$flag = $false                  # Set to $true when at least one folder is deleted

$Global:DataForEmailBody = "<p> Batch Started.....</p>"

$RootFolder = Get-ChildItem $DropLocation -Directory

foreach ($_RootFolder in $RootFolder)
{
    # Case 1: first-level folders that are NOT version-named (e.g. "Desktop") -
    # apply the age and keep-count rules to the version folders inside them.
    $FolderFirstLevel = Get-ChildItem $DropLocation\$_RootFolder -Directory -Recurse -Depth 0 |
        Where-Object { $_.Name -notmatch "^[0-9].(\d+(\.\d+){1,4})" }
    foreach ($_FolderFirstLevel in $FolderFirstLevel)
    {
        $FolderInfo = Get-ChildItem $_FolderFirstLevel.FullName |
            Where-Object { $_.LastWriteTime -lt $dt } |
            Sort-Object { [version]($_.Name -replace '^[0-9].(\d+(\.\d+){1,4})', '$1') } -Descending |
            Select-Object -Skip $FolderCountsToSkip
        foreach ($_FolderInfo in $FolderInfo)
        {
            Write-Host $_FolderInfo.FullName " Deleted!!!"
            Remove-Item $_FolderInfo.FullName -Force -Recurse
            $Global:DataForEmailBody += "<p>" + $_FolderInfo.FullName + " Deleted!!! </p>"
            $flag = $true
        }
    }

    # Case 2: first-level folders that ARE version-named - apply the rules to them directly.
    $FolderFirstLevel = Get-ChildItem $DropLocation\$_RootFolder -Directory -Recurse -Depth 0 |
        Where-Object { $_.Name -match "^[0-9].(\d+(\.\d+){1,4})" } |
        Where-Object { $_.LastWriteTime -lt $dt } |
        Sort-Object { [version]($_.Name -replace '^[0-9].(\d+(\.\d+){1,4})', '$1') } -Descending |
        Select-Object -Skip $FolderCountsToSkip
    foreach ($_FolderFirstLevel in $FolderFirstLevel)
    {
        Write-Host $_FolderFirstLevel.FullName " Deleted!!!"
        Remove-Item $_FolderFirstLevel.FullName -Force -Recurse
        $Global:DataForEmailBody += "<p>" + $_FolderFirstLevel.FullName + " Deleted!!! </p>"
        $flag = $true
    }
}

if ($flag)
{
    SendEmail
}
else
{
    $Global:DataForEmailBody += "<p> Drop location is already up to date. No cleanup required for today!! </p>"
    SendEmail
}

I have used a regular expression to find the folder names in the drop location. In my case, the folder names are version based, in a format like aa.bb.cc.dd. You need to modify the regular expression based on the naming convention in your drop location.

The below screenshot highlights the variables that implement the first two requirements.

[screenshot]

Schedule the above PowerShell script via the Task Scheduler or as a TFS job to run daily, for example as sketched below.
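A minimal sketch of scheduling it daily with the ScheduledTasks cmdlets (the script path, task name, and run time below are placeholders):

    $action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\TFSDropLocCleanupUtility.ps1"
    $trigger = New-ScheduledTaskTrigger -Daily -At 2am
    Register-ScheduledTask -TaskName "TFSDropLocationCleanup" -Action $action -Trigger $trigger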

 

[screenshot]

On every successful execution, it will clean up the drop location based on the number of days and the number of folders to keep. Finally, it will send an email listing the folders it deleted.

The email content will be like below –

Case 1 – Folders deleted and clean up done –

Email Content –

All,

fyi….

TFSDropLocationCleanupUtility has cleaned up the drop location. Here are the details…

———————————————-

Batch Started…..

P:\temp\abcd\1.0.0.52 DELETED!!!

P:\temp\abcd\1.0.0.51 DELETED!!!

P:\temp\abcd\1.0.0.46 DELETED!!!

P:\temp\abcd\1.0.0.45 DELETED!!!

P:\temp\xyz\1.0.0.69 DELETED!!!

P:\temp\zzzz\Desktop\1.0.0.36 DELETED!!!

P:\temp\zzzz\Desktop\1.0.0.31 DELETED!!!

———————————————-

Regards, TFS Administration Team

This is an auto generated email from the scheduled Job in TFS. If you need more information, contact TFS admin

————————

Case 2 – Drop location is already up to date –

Email Content –

All,

fyi….

TFSDropLocationCleanupUtility has cleaned up the drop location. Here are the details…

———————————————-

Batch Started…..

Drop location is already up to date. No cleanup required for today!!

———————————————-

Regards, TFS Administration Team

This is an auto generated email from the scheduled Job in TFS. If you need more information, contact TFS admin

——End of the Article—-

Docker–Windows Server 2016

The requirement is to understand how to deploy an IIS-based website in a container.

To play around, create a Windows Server 2016 based VM in Microsoft Azure.

Steps to follow –

  1. Configure the Docker environment in Windows Server 2016:

            Run the below commands –

      • Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force
      • Install-Module -Name DockerMsftProvider -Force
      • Install-Package -Name docker -ProviderName DockerMsftProvider -Force
      • Restart-Computer -Force

 

        2.   To verify that it is installed correctly, run the command –

              docker version 

            [screenshot]

 

        3. Next, pull the IIS image from the Docker registry. Run the below command –

            docker pull microsoft/iis

            This command will take some time to pull and extract the IIS image onto the server. Once it completes successfully, run the command “docker images” to list the pulled images.

 

         [screenshot]

 

     4. Run the following command to start the container –

         docker run -d --name myFirstIIS -p 80:80 microsoft/iis

     5. Run one of the following commands to list the containers –

         docker ps -a

          Or

         docker container ls

     6. Run the following command to open a command prompt inside the container – docker exec -it myFirstIIS cmd

     7. Write an index.html into the inetpub folder of the IIS inside the container –

      echo "Hello World From a Windows Server Container" > C:\inetpub\wwwroot\index.html

        [screenshot]
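Optionally, verify the page from the container host itself. On Windows Server 2016 the published port is typically not reachable via localhost from the host, so this sketch queries the container’s NAT IP directly (the container name and the default “nat” network match the run command above):

    $ip = docker inspect -f "{{.NetworkSettings.Networks.nat.IPAddress}}" myFirstIIS
    Invoke-WebRequest "http://$ip" -UseBasicParsing | Select-Object -ExpandProperty Content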

     8. Browse the IIS site as shown below –

        [screenshot]

 

All the steps above can also be done through the Dockerfile concept, sketched briefly below. I will explain it in detail in the next article.
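As a quick preview, a minimal Dockerfile sketch covering steps 3 and 7 might look like the below (illustrative only):

    # escape=`
    FROM microsoft/iis
    RUN echo Hello World From a Windows Server Container > C:\inetpub\wwwroot\index.html

It could then be built and started with “docker build -t myiis .” followed by “docker run -d -p 80:80 myiis”.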

 

                  ——End of the Article—–

Code signing using PowerShell Scripting

Scenario – You have a requirement to sign the output of your application. The output can be in the form of .dll, .exe, .ocx, etc. You need to sign all of them before distributing them to others.

Requirements – To implement code signing, you will need two things –

  1. Security certificate
  2. Timestamp server URL

You need to install the certificate on the server from which you want to sign your code. Ensure that the certificate is non-exportable; otherwise someone could export it and misuse it.

The timestamp ensures that the signature remains valid even after the certificate itself expires.

Implementation – With the above requirements in place, we can use the below PowerShell Cmdlets to sign the code –

  1. Set-AuthenticodeSignature – Adds an Authenticode signature to a file.
  2. Get-AuthenticodeSignature – Gets information about the Authenticode signature of a file. With this cmdlet, you can check whether a file has a valid signature, as shown below.
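For example, a quick single-file check looks like this (the path is a placeholder):

    (Get-AuthenticodeSignature "C:\Temp\MyApp.dll").Status    # returns 'Valid' for a correctly signed file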

I have written the below script to sign any .dll, .exe, or .ocx files present under the base location where the script is placed, if they don’t already have a valid signature.

It performs the below activities –

  1. It reads the code-signing certificates from the certificate store on the server.
  2. It then finds the certificate we want to use (by thumbprint) from the output of the above command and stores it in a variable.
  3. A command in the script gets the full paths of all the files (.dll, .ocx, .exe) from all the folders and subfolders under the location where the script is kept.
  4. For each file found in step 3, it checks whether the file has a valid signature. If it does, the file is skipped; if not, the function “CodeSigning” is called to sign it.
  5. All activities are captured in a log file at the same location as well as displayed on the console.

Here is the PowerShell Script –

<#
.Synopsis
   Script to sign all internally built (.dll, .exe, .ocx) file outputs.
#>

function CodeSigning
{
    Param (
            [Parameter(Mandatory=$True)]
            [ValidateNotNull()]
            $FileNameWithPath,
            [Parameter(Mandatory=$True)]
            [ValidateNotNull()]
            $CertInfo
          )

    Write-Host "---------------------------------------------"
    Write-Host "FileName - " $FileNameWithPath
    Write-Host "Code Signing Started -"
    Set-AuthenticodeSignature $FileNameWithPath $CertInfo -TimestampServer "http://TimeStampURL"
    Write-Host "Code Signing Finished Successfully"
    Write-Host "---------------------------------------------"
}

Start-Transcript -Path ".\codesigningTrans.log"

# Read all code-signing certificates from the local machine store.
$cert = Get-ChildItem Cert:\LocalMachine\My -CodeSigningCert
Write-Host "------------All Certificate Information from the server----------"
Write-Host $cert
Write-Host "---------------------------------------------"

foreach ($_cert in $cert)
{
    # Pick the certificate to use by its thumbprint.
    if ($_cert.Thumbprint -eq "ReplaceWithACTUAL")
    {
        $CertInfo = $_cert
        Write-Host "------------Certificate in Use Start--------------"
        Write-Host $CertInfo
        Write-Host "------------Certificate in Use End--------------"

        # Full paths of all .dll/.exe/.ocx files under the script location (all subfolders).
        $FileData = Get-ChildItem -Recurse |
            Where-Object { $_.Extension -in ".dll", ".exe", ".ocx" } |
            ForEach-Object -Process { $_.FullName }

        foreach ($_FileData in $FileData)
        {
            $FileNameWithPath = $_FileData
            $IsValid = Get-AuthenticodeSignature $FileNameWithPath | Where-Object { $_.Status -eq "Valid" }
            if (-not $IsValid)   # no valid signature - sign the file
            {
                CodeSigning $FileNameWithPath $CertInfo
            }
            else
            {
                Write-Host $FileNameWithPath " already has a valid signature"
            }
        }
    }
}
Stop-Transcript

# Re-write the transcript to normalize its encoding.
$log = Get-Content ".\codesigningTrans.log"
$log > ".\codesigningTrans.log"

————————-End of Article———————-

Connecting TFS GIT in LINUX

Problem –

Developers were getting certificate errors when trying to perform Git operations (like clone) against TFS from Linux (CentOS).

Resolution –

The solution is to generate an SSH key in Linux and register it in TFS. With it, Git on Linux can handshake with TFS over SSH.

Here are the steps that need to be performed in Linux –

1. Generate an SSH key. Run the below command – 

      ssh-keygen -t rsa -C "emailID"

Note – “emailID” is your email address.

2. Run the below command to display the public key, and copy the output as shown below –

      cat ~/.ssh/id_rsa.pub

[screenshot]

Go to the TFS web portal and follow the below screens –

3. Click on the person icon and select the Security option.

[screenshot]

4. The below screen will appear. Select the “SSH Public Keys” option and click Add. When it asks for the key, enter the key you copied in step 2 above.

[screenshot]

Now go back to the Linux environment and try the git clone operation using the repository’s SSH clone URL. It should work.
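For example (the server, collection, project, and repository names below are placeholders – copy the actual SSH clone URL from the repository page in the TFS web portal):

      git clone ssh://tfsserver:22/tfs/DefaultCollection/MyProject/_git/MyRepo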

                      ——–End of Article——

TFS–Automated Builds–Agent Workspace–Application folder

In the current scenario, when we build an application through a build definition in TFS, it downloads all the source code/components into the agent’s workspace in a folder whose name is a number instead of the application name. For example, in the case of the xxxBillSplit application, it downloaded the source code into folder “7” under the agent workspace.

For many applications like xxxBillSplit, this is perfectly fine, but for applications like XYZ where we have cross-team project references, builds fail.

To address this issue for complex applications like XYZ, the following changes can be made –

  1. Under the agent’s workspace, a folder named “SourceRootMapping” gets created as soon as the agent builds its first application. This is a single folder for all the applications the agent builds. Under it, there is a folder per team project collection, with the collection ID (GUID) as the name, as shown below – [screenshot]
  2. Under the GUID folder, there is a folder named after the build definition ID. Both the GUID and the build definition ID can be found in the build definition, as shown below – [screenshot]
  3. Under the build ID folder, there is a JSON file called “SourceFolder.json” which contains information about the builds. Note that “7” is referenced in several places, shown in the highlighted boxes – [screenshot]
  4. Replace the build ID (7 here), as highlighted above, with the application name (see the sketch after this list). [screenshot]
  5. Once done, rebuild the application. A folder with the application name will be created. You can then delete the folder with the ID (7 here), as shown below – [screenshot]
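To illustrate step 4, the change in SourceFolder.json has roughly the following shape (a simplified sketch – the exact field names vary by TFS version, so edit only the entries your own file actually contains):

    "agent_builddirectory": "xxxBillSplit",              (was "7")
    "build_sourcesdirectory": "xxxBillSplit\\s",         (was "7\\s")
    "build_artifactstagingdirectory": "xxxBillSplit\\a"  (was "7\\a")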

This activity needs to be done for all the team projects, and it is a one-time activity. Once done, a backup of the SourceRootMapping folder can be taken; in case of a new server, new agent, or new workspace, this folder can be restored to re-apply the changes.

 

                 ————End of the Article————-

App-V Package Publishing in XenApp 7.8

Purpose –

The purpose of this blog post is to describe the steps required to publish a virtualized package, created using Microsoft App-V, in Citrix XenApp 7.8.

App-V Package Publish Steps for XenApp 7.8 –

Follow the below steps –

1. Launch “Citrix Studio”.

2. Go to the node Configuration -> App-V Publishing. Right-click on “App-V Publishing” and click on Add Package.

[screenshot]

3. It will open a window to browse for the App-V package (.appv) file. Select the file with the “.appv” extension and click Open.

[screenshot]

The below screen will appear…

[screenshot]

On successful addition of the package, the below screen will appear:

[screenshot]

4. Now go to the “Applications” node, right-click it, and select the “Add Applications” option as shown below –

[screenshot]

5. The below screen will appear. Click Next.

[screenshot]

6. Select the “Delivery Group” to which the virtualized application will be delivered.

[screenshot]

7. Click on the Add button and select the “App-V..” option as shown below –

 [screenshot]

The below screen will be displayed. It shows the list of App-V packages we added before. Select the respective one and click OK. The dialog will close and control will return to the previous screen. Click Next to reach the summary screen, then click Finish to add the application.

[screenshot]

8. Once the application is added, right-click on it and select Properties to add the users for its access control. The below screen will appear.

[screenshot]

Add the required users as shown below –

[screenshot]

The application is now published in XenApp 7.8 and ready to use.