Why GitHub Copilot is already awesome
I've mostly worked with Microsoft 365 Copilot for daily productivity, but my recent experience with GitHub Copilot was simply mind-blowing. This post outlines how I used it and why I think this technology will transform the way people code and make them more productive.
Setting the stage
In my job, I work a lot on PowerShell script libraries for data migrations to or between Microsoft 365 tenants. I use VS Code for my projects and Azure DevOps for source control (Git) and backlog management. Apart from that, I obviously work with Microsoft 365 and Microsoft 365 Copilot.
For a recent project, the objective was to build an information management application in SharePoint Online, including custom web parts based on the SharePoint Framework. The latter are coded by a developer, while my focus is on creating the scripts required to provision and configure sites and to migrate the data from a third-party application. That data had already been imported into a temporary site in SharePoint Online, and the scripts should support the so-called second-step migration to the destination locations in the new information management application.
The migration consists of several steps:
- Convert CSV data into several term sets and list data.
- Create folder structures and document sets with additional metadata.
- Copy files from the temporary site to the final destination.
The CSV data is basically an ETL'd merge of the source metadata and the desired target architecture. It contains product names, product sub-series, and product groups, with references to multiple documents linked to these products. Because of the document links, product names and related values are duplicated, as each row contains both the product-related metadata and a document name. The sample file I used had approx. 25,000 rows. This is all under NDA, so I hope this gives at least a rough picture of the data.
The objective for the term set import script was to get just the unique product values from the CSV data, without any of the document-related info. I used PowerShell to import the CSV data and group on product names to get unique values. From that point onwards, I would loop through each grouped row to get the information needed to create the term set data in the right format.
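That first step looks roughly like this; 'Model Name' and the file path are stand-ins for the real (NDA'd) values:

$rows   = Import-Csv -Path "./inputfile.csv"
# Group on the product name; each group represents one unique product,
# no matter how many document rows reference it.
$groups = $rows | Group-Object -Property 'Model Name'
foreach ($group in $groups) {
    $first = $group.Group[0]   # one representative row per unique product
    # ... collect the term set data from $first here ...
}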
Creating term sets
My go-to module was obviously PnP.PowerShell, as this is the best solution to programmatically interact with SharePoint Online. This module has several cmdlets that support importing term sets or taxonomies into the term store. I first used Import-PnPTermSet with |-separated values for the term group, term set, and term hierarchy. When doing so, I encountered several errors on terms that already existed. Manually importing the term sets worked without any issues, but that was too much manual effort for 50 term sets, considering the total dataset.
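For context, those pipe-delimited term paths look roughly like this; the sketch uses Import-PnPTaxonomy, the sibling cmdlet that accepts 'Group|TermSet|Term' strings directly, with placeholder values:

# Hypothetical term paths: term group, term set, then the term hierarchy.
Import-PnPTaxonomy -Terms @(
    "Products|Product group 1|Product series 1|Product 1",
    "Products|Product group 1|Product series 1|Product 2"
)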
After several hours of troubleshooting the cmdlet, I tried a different approach. Using the manual import, I was able to successfully import 8 term sets. I then exported all term sets to XML using Export-PnPTermGroupToXml. After that, I imported the term sets again using Import-PnPTermGroupFromXml. That worked without errors this time. So I needed to change my approach and convert the CSV data to the XML format used by these cmdlets.
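That round trip is roughly a one-liner per direction (site URL, group name, and paths are placeholders):

# Export the manually imported term group to XML, then re-import it.
Connect-PnPOnline -Url "https://contoso.sharepoint.com" -Interactive
Export-PnPTermGroupToXml -Identity "Products" -Out "./termgroup.xml"
Import-PnPTermGroupFromXml -Path "./termgroup.xml"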
For reference, the XML would look something like this (simplified example):
<pnp:TermGroups xmlns:pnp="http://schemas.dev.office.com/PnP/2022/09/ProvisioningSchema">
  <pnp:TermGroup Name="Products" Description="Systems Information products">
    <pnp:TermSets>
      <pnp:TermSet Name="Product group 1" ID="f87e1b00-19ca-4f0d-8e7e-2cfa4f858499" Description="">
        <pnp:Terms>
          <pnp:Term Name="Product series 1">
            <pnp:Terms>
              <pnp:Term Name="Product 1" />
              <pnp:Term Name="Product 2" />
              <pnp:Term Name="Product 3" />
            </pnp:Terms>
          </pnp:Term>
        </pnp:Terms>
      </pnp:TermSet>
    </pnp:TermSets>
  </pnp:TermGroup>
</pnp:TermGroups>
To accomplish this, I needed to perform the following tasks in the script:
- Read the CSV file: The script starts by importing data from a CSV file. This file contains the mappings that need to be transformed.
- Create an XML document: It then creates a new XML document. XML is a format that can be used to structure data in a way that other systems can understand.
- Define the namespace: The script defines a ‘pnp’ namespace for the XML. This is like setting a context or a specific vocabulary that the XML will use.
- Add the root element: It creates a root element called ‘TermGroups’ in the XML document. This is the top-level element that will contain all other elements.
- Add the TermGroup element: Within the root, it adds the ‘TermGroup’ element. This element represents a group of terms and includes attributes like name, ID, and description.
- Add the TermSets element: Inside the ‘TermGroup’, it adds the ‘TermSets’ element. This will contain the individual term sets.
- Group the data and add the term hierarchy: The script groups the data from the CSV and processes each group, transforming it into a term structure with 3 levels (see the sketch below).
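A minimal sketch of what those steps look like in PowerShell, assuming hypothetical column names like ‘Library’ and ‘1st-level Folder Name’ (the real ones are under NDA):

# Minimal sketch: convert the CSV rows into the PnP term group XML.
$ns   = "http://schemas.dev.office.com/PnP/2022/09/ProvisioningSchema"
$rows = Import-Csv -Path "./inputfile.csv"

$xml  = New-Object System.Xml.XmlDocument
$root = $xml.CreateElement("pnp", "TermGroups", $ns)
[void]$xml.AppendChild($root)

$group = $xml.CreateElement("pnp", "TermGroup", $ns)
$group.SetAttribute("Name", "Products")
[void]$root.AppendChild($group)

$termSets = $xml.CreateElement("pnp", "TermSets", $ns)
[void]$group.AppendChild($termSets)

# One term set per unique 'Library' value, with a freshly generated GUID.
foreach ($libraryGroup in ($rows | Group-Object -Property 'Library')) {
    $termSet = $xml.CreateElement("pnp", "TermSet", $ns)
    $termSet.SetAttribute("Name", ($libraryGroup.Name -replace '^PRD-', ''))
    $termSet.SetAttribute("ID", [guid]::NewGuid().ToString())
    $termSet.SetAttribute("Description", "")
    [void]$termSets.AppendChild($termSet)

    $terms = $xml.CreateElement("pnp", "Terms", $ns)
    [void]$termSet.AppendChild($terms)

    # First level of the term hierarchy; levels 2 and 3 nest the same way.
    foreach ($folderGroup in ($libraryGroup.Group | Group-Object -Property '1st-level Folder Name')) {
        $term = $xml.CreateElement("pnp", "Term", $ns)
        $term.SetAttribute("Name", $folderGroup.Name.TrimStart())
        [void]$terms.AppendChild($term)
        # ... repeat for '2nd level Folder Name' and 'Model Name' ...
    }
}

$xml.Save((Join-Path $PWD "termgroups.xml"))

Creating every element through CreateElement with the ‘pnp’ prefix and the namespace URI should keep the output compatible with Import-PnPTermGroupFromXml.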
You can imagine this being a complex set of tasks to perform, one that would take me several hours to complete.
GitHub Copilot
Just before this challenge, I installed the GitHub Copilot extension (free plan). I had already used it a couple of times and really liked how it looks at your own coding style and responds to your queries in the same way.

After struggling a bit with the tasks I had to implement in the script, I decided to write a prompt in an attempt to boost my productivity and quickly complete what I had planned to do.
For starters, the script contained just the import of the CSV and the root element of the XML file. In the prompt, I detailed what the scenario was and what I was trying to accomplish.
Conclusion
It took just 5 small refinements of the prompt to have Copilot generate the relatively complex code I needed; it ran without any errors and included all the typical logging functions that I always use.
The prompt (which had links to the CSV import file and a sample XML export file) resembles the following (parts left out/replaced because of NDA):
I need to convert the ‘{inputfileName.csv}’ to term sets in SharePoint Online that can be imported using the Import-PnPTermGroupFromXml cmdlet. The script should create an XML file like the example in ‘sample.xml’. Each element should have a ‘pnp’ prefix like ‘pnp:TermGroups’, using e.g. $element.Prefix = ‘pnp’, and the term set should have a unique ID based on a generated GUID. There will be a single term group called ‘Products’.
The unique values of the ‘Library’ column are the names of the term sets (replacing ‘PRD-’ with ‘’). Use the values of ‘1st-level Folder Name’, ‘2nd level Folder Name’ and ‘Model Name’ to provide the term hierarchy with 3 levels, based on unique ‘Model Name’ values. Use TrimStart to remove any leading spaces from the term values. If the value of the ‘2nd level Folder Name’ is $null, use ‘NOT’ as the value for the term.
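For illustration, the cleanup rules from that prompt translate to something like this per CSV row (column names as quoted in the prompt):

# Hypothetical cleanup following the rules described in the prompt.
$termSetName = $row.Library -replace '^PRD-', ''
$level1 = $row.'1st-level Folder Name'.TrimStart()
$level2 = if ($null -eq $row.'2nd level Folder Name') { 'NOT' }
          else { $row.'2nd level Folder Name'.TrimStart() }
$level3 = $row.'Model Name'.TrimStart()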
This was mind-blowing and so much more efficient compared to the generic assistants I was used to. GitHub Copilot, as a code-specific AI tool, was able to use the files and the coding style in the script library to provide me with a fully working script in just 30 minutes. If I had done this without Copilot, it would have taken me at least another 3 hours to complete.
I know there is so much more to explore on how to use this, but I'm excited as it is! And this is with the free plan. So, if you are coding and not using GitHub Copilot yet, I can warmly recommend it.