TFS Drop Location Clean Up Utility in PowerShell

In TFS, we do build automation, and the output of the automated build is kept on a DFS share called the drop location. When you build an application multiple times using the automated build process, each build’s output is copied into the drop location. Say you have an application “abc” and you built it ten times. The output will be copied to the drop location in different folders like –

[DFS Share]\abc\

[DFS Share]\abc\

[DFS Share]\abc\


[DFS Share]\abc\

Over a long interval of time, the drop location will hold n number of output folders. Now think what happens when we have multiple applications and multiple teams working on them. It will consume a lot of space.

To address this issue, one has to manually manage the drop location. Manual activity is always time consuming and prone to errors.

One has to look for some kind of automation to manage the drop location. Here are the basic requirements –

1.) We need to delete output folders older than n (say 10) days.

2.) We need to keep the last three output folders, even if they are older than n (say 10) days.

3.) It should send an email to the concerned team/person notifying them of the clean-up done.

These three requirements can be easily implemented using PowerShell scripting.

Here is the script content –

Minimum PowerShell version required is v5.


<#
.Synopsis
   Script is to delete all the folders with the content from the TFS Drop location which are older than n number of days.
.DESCRIPTION
   Script is to delete all the folders with the content from the TFS Drop location which are older than n number of days.
#>

#Updated - First Version

# Global collector for the email body
$Global:DataForEmailBody

Function SendEmail {

    $Email = New-Object -ComObject "CDO.Message"
    $Email.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
    $Email.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = 'XXXXX'
    $Email.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
    $Email.Configuration.Fields.Update()

    $Email.From = ""
    $Email.To = ""
    $Email.Subject = "Utility | Scheduled Job | TFS Drop Location Cleanup Activity"
    $Email.HTMLBody = "<font color=""black"" face=""Calibri, Verdana"" size=""3"">
                    <p><b> All, </b></p>
                    <p><mark> fyi....</mark></p>
                    <p> TFSDropLocationCleanupUtility has cleaned up the drop location. Here are the details...</p>
                    <p>----------------------------------------------</p>
                    <a> $Global:DataForEmailBody </a>
                    <p>----------------------------------------------</p>
                    <p> Regards, TFS Administration Team </p>
                    <font color=""blue"" face=""Arial, Verdana"" size=""2"">
                    <p> This is an auto generated email from the scheduled Job in TFS. If you need more information, contact TFS admin </p>"

    $Email.Send()
}

#Start-Transcript -Path ".\TFSDropLocCleanupUtility.log"
$DayCount = 30            # Folders older than this many days are candidates for deletion
$dt = (Get-Date).AddDays(-$DayCount)
$FolderCountsToSkip = 3   # Always keep the newest three output folders
$DropLocation = "P:\temp"
$flag = $false

$Global:DataForEmailBody = "<p> Batch Started.....</p>"

$RootFolder = Get-ChildItem $DropLocation -Directory

foreach ($_RootFolder in $RootFolder)
{
    # First-level folders whose names are NOT version-like: clean up one level deeper
    $FolderFirstLevel = Get-ChildItem $DropLocation\$_RootFolder -Directory -Recurse -Depth 0 |
        Where-Object { $_.Name -notmatch "^[0-9].(\d+(\.\d+){1,4})" }
    foreach ($_FolderFirstLevel in $FolderFirstLevel)
    {
        $FolderInfo = Get-ChildItem $_FolderFirstLevel.FullName |
            Where-Object { $_.LastWriteTime -lt $dt } |
            Sort-Object { [version]($_.Name -replace '^[0-9].(\d+(\.\d+){1,4})', '$1') } -Descending |
            Select-Object -Skip $FolderCountsToSkip
        foreach ($_FolderInfo in $FolderInfo)
        {
            Write-Host $_FolderInfo.FullName " Deleted!!!"
            Remove-Item $_FolderInfo.FullName -Force -Recurse
            $Global:DataForEmailBody += "<p>" + $_FolderInfo.FullName + " Deleted!!! </p>"
            $flag = $true
        }
    }

    # First-level folders whose names ARE version-like: clean them up directly
    $FolderFirstLevel = Get-ChildItem $DropLocation\$_RootFolder -Directory -Recurse -Depth 0 |
        Where-Object { $_.Name -match "^[0-9].(\d+(\.\d+){1,4})" } |
        Where-Object { $_.LastWriteTime -lt $dt } |
        Sort-Object { [version]($_.Name -replace '^[0-9].(\d+(\.\d+){1,4})', '$1') } -Descending |
        Select-Object -Skip $FolderCountsToSkip
    foreach ($_FolderFirstLevel in $FolderFirstLevel)
    {
        Write-Host $_FolderFirstLevel.FullName " Deleted!!!"
        Remove-Item $_FolderFirstLevel.FullName -Force -Recurse
        $Global:DataForEmailBody += "<p>" + $_FolderFirstLevel.FullName + " Deleted!!! </p>"
        $flag = $true
    }
}

if ($flag)
{
    SendEmail
}
else
{
    $Global:DataForEmailBody += "<p> Drop Location is already up to date. No cleanup required for today!! </p>"
    SendEmail
}



I have used a regular expression to find the folder names in the drop location. In my case, the folder name is based on the version number. You need to modify the regular expression based on the naming convention in your drop location.
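To see what the version-aware sort buys us, here is a tiny sketch with hypothetical folder names (my real naming convention differs, which is why the script carries a regex you would adapt):

```powershell
# Hypothetical version-named output folders - adjust to your own convention.
$folders = "1.0.0.1", "1.0.0.10", "1.0.0.2", "1.0.0.7", "1.0.0.3"

# Sort by [version] (a plain string sort would put 1.0.0.10 before 1.0.0.2),
# keep the newest three, and treat the rest as deletion candidates -
# the same Sort-Object / Select-Object -Skip trick the script uses.
$candidates = $folders |
    Sort-Object { [version]$_ } -Descending |
    Select-Object -Skip 3

$candidates   # -> 1.0.0.2, 1.0.0.1
```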

Please note the below screenshot for first two requirements.


Schedule the above PowerShell script in Task Scheduler, or as a TFS job, so that it runs daily.
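The scheduling itself can also be scripted. A minimal sketch, assuming the script is saved as C:\Scripts\TFSDropLocCleanup.ps1 and a 2 AM daily run (the path, task name, and time are all placeholders):

```powershell
# Register a daily 2 AM run of the cleanup script via Task Scheduler.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
           -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\TFSDropLocCleanup.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "TFSDropLocationCleanup" -Action $action -Trigger $trigger
```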



On every successful execution, it will clean up the drop location based on the number of days and the number of folders to keep. Finally, it will send an email with the information about the folders it has deleted.

Email Content will be like below –

Case 1 – Folders deleted and clean up done –

Email Content –



TFSDropLocationCleanupUtility has cleaned up the drop location. Here are the details…


Batch Started…..

P:\temp\abcd\ Deleted!!!

P:\temp\abcd\ Deleted!!!

P:\temp\abcd\ Deleted!!!

P:\temp\abcd\ Deleted!!!

P:\temp\xyz\ Deleted!!!

P:\temp\zzzz\Desktop\ Deleted!!!

P:\temp\zzzz\Desktop\ Deleted!!!


Regards, TFS Administration Team

This is an auto generated email from the scheduled Job in TFS. If you need more information, contact TFS admin


Case 2 – Drop location is already up to date –

Email Content –



TFSDropLocationCleanupUtility has cleaned up the drop location. Here are the details…


Batch Started…..

Drop Location is already up to date. No cleanup required for today!!


Regards, TFS Administration Team

This is an auto generated email from the scheduled Job in TFS. If you need more information, contact TFS admin

——End of the Article—-


Docker–Windows Server 2016

The requirement is to understand how to deploy an IIS-based website in a container.

To play around, create a Windows Server 2016 based VM in MS Azure.

Steps to follow –

  1. Configure Docker environment in Windows Server 2016:

            Run the below commands –

      • Install-PackageProvider -Name NuGet -MinimumVersion -Force
      • Install-Module -Name DockerMsftProvider -Force
      • Install-Package -Name docker -ProviderName DockerMsftProvider -Force
      • Restart-Computer -Force


        2.   To verify it is installed correctly, run the command –

              docker version 



        3. The next step is to pull the IIS image from the Docker secured registry. Run the below command –

            docker pull microsoft/iis

            This command will take some time to pull and extract the IIS image onto the server. Once completed successfully, run the command “docker images” to list the images pulled from DSR.




     4. Run the following command to start the container –

         docker run -d --name myFirstIIS -p 80:80 microsoft/iis

     5. Run either of the following commands to list the containers (the -a switch includes stopped ones) –

         docker ps -a


         docker container ls

     6. Run the following command to open a command prompt inside the container – docker exec -it myFirstIIS cmd

     7. Write an index.html into the IIS inetpub folder inside the container as –

      echo "Hello World From a Windows Server Container" > C:\inetpub\wwwroot\index.html


     8. Browse the IIS site as shown below –


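Before opening a browser, you can sanity-check the site from the host with PowerShell. Note that on Windows Server 2016, loopback to a published port did not always work, so this queries the container's internal IP instead (the template string simply reads whatever NAT network the container is on):

```powershell
# Get the container's IP on its network and request the IIS default page.
$ip = docker inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}" myFirstIIS
Invoke-WebRequest -UseBasicParsing "http://$ip" | Select-Object StatusCode
```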

All the steps above can also be done through the Dockerfile concept. I will explain it in the next article.


                  ——End of the Article—–

Code signing using PowerShell Scripting

Scenario – You have a requirement to sign the output of your application. The output can be in the form of .dll, .exe, .ocx etc. You need to sign all of them before distributing them to others.

Requirements – To implement the code signing, you will need two things –

  1. Security certificate
  2. Timestamp server URL

You need to install the certificate on the server from which you want to sign your code. Ensure that the certificate is non-exportable, or else someone could export it and misuse it.

A timestamp is needed so that the signature remains verifiable, ensuring the validity of the signed code.

Implementation – With the above requirements in place, we can use the below PowerShell Cmdlets to sign the code –

  1. Set-AuthenticodeSignature – This adds an Authenticode signature to a file.
  2. Get-AuthenticodeSignature – This gets information about the Authenticode signature in a file. With this cmdlet, you can find out whether the file has a valid signature or not.
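As a quick illustration of the two cmdlets (a minimal sketch: MyApp.dll and the timestamp URL are placeholders, and a code-signing certificate is assumed to be present in the machine store):

```powershell
# Pick the first code-signing certificate from the machine store.
$cert = Get-ChildItem Cert:\LocalMachine\My -CodeSigningCert | Select-Object -First 1

# Sign the file and timestamp it (replace the URL with your CA's timestamp service).
Set-AuthenticodeSignature -FilePath .\MyApp.dll -Certificate $cert `
    -TimestampServer "http://timestamp.example.com"

# Verify - Status reads 'Valid' for a correctly signed file.
(Get-AuthenticodeSignature .\MyApp.dll).Status
```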

I have written the below script to sign any .dll, .exe, or .ocx files present under the base location where the script is placed, if they don’t have a valid signature.

It does below activities –

  1. It reads the certificates from the store on the server.
  2. It then finds the certificate we want to use from the output of the above command and stores it in a variable for later use.
  3. A command line in the script gets the full path of every file (.dll, .ocx, .exe) from all the folders and subfolders under where the script is kept.
  4. It then checks whether each file (with full path) found in step 3 has a valid signature. If it does, the file is ignored and the script moves to the next file. If the file is not signed, the function “CodeSigning” is called to sign it.
  5. All the activity is captured in a log file at the same location as well as displayed on the console.

Here is the PowerShell Script –

<#
.Synopsis
   Script is to sign all internal built (.dll, .exe, .ocx) file outputs
#>

function CodeSigning
{
    Param (
        $FileNameWithPath,
        $CertInfo
    )

    Write-Host "---------------------------------------------"
    Write-Host "FileName - " $FileNameWithPath
    Write-Host "Code Signing Started--"
    # The timestamp server URL was omitted here - supply your CA's URL after -TimestampServer
    Set-AuthenticodeSignature $FileNameWithPath $CertInfo -TimestampServer
    Write-Host "Code Signing Finished Successfully"
    Write-Host "---------------------------------------------"
}

Start-Transcript -Path ".\codesigningTrans.log"
$cert = (dir cert:\localmachine\my\ -CodeSigningCert)
Write-Host "------------All Certificate Information from the server----------"
Write-Host $cert
Write-Host "---------------------------------------------"
foreach ($_cert in $cert)
{
    if ($_cert.Thumbprint -eq "ReplaceWithACTUAL")
    {
        $CertInfo = $_cert
        Write-Host "------------Certificate in Use Start--------------"
        Write-Host $CertInfo
        Write-Host "------------Certificate in Use End--------------"

        $FileData = Get-ChildItem -Recurse |
            Where-Object { $_.Extension -in ".dll", ".exe", ".ocx" } |
            ForEach-Object -Process { $_.FullName }

        foreach ($_FileData in $FileData)
        {
            $FileNameWithPath = $_FileData
            $IsValid = Get-AuthenticodeSignature $FileNameWithPath | Where-Object { $_.Status -eq "Valid" }
            if (-not $IsValid)
            {
                CodeSigning $FileNameWithPath $CertInfo
            }
            else
            {
                Write-Host $FileNameWithPath " already has valid signature"
            }
        }
    }
}
Stop-Transcript
$log = Get-Content ".\codesigningTrans.log"
$log > ".\codesigningTrans.log"
————————-End of Article———————-

Connecting TFS GIT in LINUX

Problem –

Developers were getting errors when trying to do Git operations (like clone) against TFS from Linux (CentOS). They were getting certificate issues.

Resolution –

The solution is to generate an SSH key in Linux and register it in TFS. With it, Git on Linux is able to handshake with TFS.

Here are the steps that need to be performed in Linux –

1. Generate SSH key. Run the below command - 

      ssh-keygen -t rsa -C "emailID"

Note – “emailID” is your email ID.

2. Run the below command to display the public key, and copy the output as shown below –

      cat ~/.ssh/


Go to the TFS web portal and follow the below screens –

3. Click on the person icon and select the Security option.


4. The below screen will appear. Select the “SSH Public Keys” option and click Add. It will ask for the key; enter the key you copied in step 2 above.


Now go back to the Linux environment and try the git clone operation. It should work.

                      ——–End of Article——

TFS–Automated Builds–Agent Workspace–Application folder

In the current scenario, when we build any application through a build definition in TFS, it downloads all the source code/components into the agent’s workspace in a folder with a digit as the folder name instead of the application name. For example, in the case of the xxxBillSplit application, it downloaded the source code into folder “7” under the agent workspace.

For many applications like xxxBillSplit, this is perfectly OK, but for applications like XYZ, where we have cross-team-project references, it does not work and the build fails.

To address this issue for complex applications like XYZ, the following changes can be made –

  1. Under the agent’s workspace, a folder with the name “SourceRootMapping” gets created as soon as the agent builds its first application. This is a single folder for all the applications the agent is building. Under this folder, there are folders for all the build definitions, with the collection ID (a GUID) as the name, as shown below –
  2. Under the GUID folder, there is a folder named after the build definition ID. Both the GUID and the build definition ID can be found from the build definition as shown below –
  3. Under the build ID folder, there is a JSON file called “SourceFolder.json” which contains information about the builds. Please note the “7” referenced in many places, which I have shown in the highlighted box –
  4. Replace the build ID (“7” here) as highlighted above with the application name as shown below.
  5. Once done, rebuild the application. A folder with the application name will be created. You can then delete the folder with the ID (“7” here) as shown below –
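For orientation, SourceFolder.json looks roughly like the sketch below (the field names are reproduced from memory and trimmed, so treat them as illustrative; your TFS/agent version may differ). The edit in step 4 amounts to replacing the “7” path segments with the application name:

```json
{
  "agent_builddirectory": "xxxBillSplit",
  "build_sourcesdirectory": "xxxBillSplit\\s",
  "build_artifactstagingdirectory": "xxxBillSplit\\a",
  "definitionName": "xxxBillSplit",
  "definitionId": "7"
}
```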

This activity needs to be done for all the team projects, and it is a one-time activity. Once done, a backup of this folder (SourceRootMapping) can be taken; in the case of a new server/new agent/new workspace, this folder can be restored to re-apply the changes.


                 ————End of the Article————-

App-V Package Publishing in XenApp 7.8

Purpose –

The purpose of this blog post is to state the steps required to publish a virtualized package, created using Microsoft App-V, in Citrix XenApp 7.8.

App-V Package Publish Steps for XenApp 7.8 –

Follow the below steps –

1. Launch “Citrix Studio”.

2. Go to the node Configuration -> App-V Publishing. Right click on “App-V Publishing”. Click on Add Package.


3. It will open a window to browse for the App-V package (.appv) file. Select the file with the extension “.appv” and click Open.


Below screen will appear…


On successful addition of package, below screen will appear:


4. Now, go to the “Applications” node, right click it and select “Add Applications” option as shown below –


5. Below screen will appear. Click Next.


6. Select the “Delivery Group” to which the virtualized application will be delivered.


7. Click on Add button and select the “App-V..” option as shown below –


The below screen will be displayed. It shows the list of App-V packages we added before. Select the respective one and click OK. This closes the screen and returns control to the previous one. Click Next to go to the summary screen, then click Finish to add the application.


8. Once the application is added, right-click on it and select Properties to add the users for its access control. The below screen will appear.


Add the required users as shown below –


The application is now published in XenApp 7.8 and is ready to use.

DevOps – Release Perspective

Hi Folks…

We are hearing the “DevOps” buzzword often these days… so I thought of writing up my understanding of it from a release perspective.

What is it? What impact can it have on us?

Let’s think of life without it. Let’s sit in a time machine and go into the past. How did IT work during that time? A company has a business and needs IT to automate its activities. The business gives the requirements to IT. IT has many groups, like the development group, the QC/QA group, and the operations group that deals with production/infrastructure support.

Information flows from one group to another in a sequential manner. Once IT receives requirements from the business, it does some feasibility analysis and then assigns them to the development group. This assignment can be manual (via emails/Excel) or through some tool.

The development team starts working on it; their only goal is to implement the requirements given by the business, and they have no idea about the actual production environment. The development team uses tools for versioning their code, and they may or may not be using any tool to automate testing/builds.

In the absence of any tooling, they had to do the building of code and all the testing manually. With long deadlines and a once-in-a-while release, it worked fine. The development team writes code, and at the end of coding they do a full build and system testing. Finding what broke an existing feature is a costly thing. But all is well as long as they are delivering the software to the business. The business finds issues, contacts the development team again, and the same process goes on until acceptable software is delivered to operations.

Till this point there is hardly any talk between the development team and operations. Operations, with their understanding of production, often have many important recommendations, but since they come into the picture at the end, the development team doesn’t accept their recommendations and they have to carry on with the release. Operations might find it difficult, or even impossible, to put the software into production as-is, because it was not designed in a way that lets it be hosted on that particular environment. A good example is with XenApp: since many users access the application from the same XenApp server, if the application writes some information to a common file, the last user will overwrite the information of the previous user. Such things need to be addressed in the development phase, but since the development team is not aware, such changes need to be made at the end, which calls for additional testing and hence additional cost.

This is just the tip of the iceberg. There can be lots of problems simply because of poor coordination between the development team and operations.

In the current dynamic business scenario, business is changing rapidly, and changes need to reach production as soon as possible. If changes take time, you are out of the business. Competition is huge. With the old traditional methods, we cannot continue. Many companies like Amazon and Google have demonstrated that they can make multiple releases to production in a day. This has raised management’s expectations many fold. How have these guys made it possible? If they can deliver, why can’t we? It has many perspectives, and we should all understand what they have done differently to make it happen. Do they have some magic? Not at all… Let’s discuss it.

The first step is to change the way we think: if they can do it, we can too.

The second step is to think about how to automate all, or most, of the manual activities. Many manual activities that we do often can be easily automated, like builds, testing, deployments, etc. We can have continuous integration and continuous delivery to keep things moving fast. How does it work? Let’s discuss it.

A developer has just finished the task he was working on and now wants to check in. He has two options. One is to do the build manually on the development machine, do some initial unit testing, and then check in. The second is that, on check-in, the already-written unit test cases run automatically and the build is done on an independent machine. This second option is called continuous integration; with it, developers get the results in a few minutes, and their build is also pristine. With the first option, developers have to run the unit tests themselves; they may miss a few test cases, and that can be costly at the time of the actual release.

In continuous delivery, the build output is continuously deployed to the target environment on every change, with a proper approval workflow in place.

The essence of continuous integration and delivery is maximum automation. To management, it looks as simple as clicking a button. Enabling continuous integration and continuous delivery is itself a time-consuming thing, as it often needs scripting/programming. Writing such scripts at the time of release would add extra effort and time that might defeat the actual purpose of the automation. Then how do we create them early?

Here comes the main point behind DevOps. DevOps = Dev + Ops. The development team and operations have to have excellent coordination and communication from the beginning of the software life cycle. This way, automation can be implemented while the development team is doing the coding. As soon as the automation framework is created, the development team can start using it in their day-to-day activity, which will refine it to the maximum; at the time of the actual release, the framework is already in place, and a click can make the release a piece of cake. It will always deliver with great accuracy and predictability.