When performing migrations in or to SharePoint Online, metrics are key to ensuring that the intended set of files and folders was fully copied over. This post covers a basic approach using PnP.PowerShell to get these metrics, even when working with large libraries storing 5,000 items or more.

Setting the stage

You’re working in a large-scale migration where you need to verify whether the destination container in SharePoint Online or OneDrive for Business contains the expected number of files and folders and the expected storage size. Depending on the specific scenario, you need to measure this for a complete document library or just for a folder and all of its contents.
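
The snippets below assume an established connection to the destination site, stored in a $connection object. A minimal sketch of creating one with Connect-PnPOnline (the site URL and library name are placeholders, and interactive login is just one of the available authentication options):

#Connect to the destination site and keep the connection object for later reuse
$connection = Connect-PnPOnline `
    -Url 'https://contoso.sharepoint.com/sites/Migration' `
    -Interactive `
    -ReturnConnection

#The library to measure
$listName = 'Documents'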

After connecting to the site, you first need to retrieve the list:

$list = Get-PnPList -Identity $listName -Connection $connection

#Only measure a specific folder in the library
$destinationFolder = 'Subfolder'

Depending on the scenario, you either measure the entire library or just a folder ($destinationFolder) by defining the $folderSiteRelativeUrl variable:

if ([string]::IsNullOrEmpty($destinationFolder)) {

    #No destination folder, just use the rootfolder URL of the library
    [string]$folderSiteRelativeUrl = $list.RootFolder.Name

}
else {

    #Combine the rootfolder URL and destinationfolder
    [string]$folderSiteRelativeUrl = $list.RootFolder.Name + '/' + $destinationFolder

}

Measuring files and folders

To get the item count in terms of files and folders, you need to perform a query on the list. I used to work with the Collaborative Application Markup Language (CAML), which is still a very powerful way to create very specific queries. However, once a list grows beyond the 5,000-item list view threshold, I would run into issues when querying items located in specific folders.
Of course, PnP.PowerShell has a solution for this.

$items = Get-PnPFolderItem `
    -FolderSiteRelativeUrl $folderSiteRelativeUrl `
    -Recursive `
    -Connection $connection

So, based on the scenario, the cmdlet is pointed either at the library itself or at a folder within it, and with the -Recursive parameter it returns all files and folders at any depth below that starting point.

Depending on the size of the library or folder, executing the cmdlet may take some time.
But after that, it’s easy to get the data.

if ($null -ne $items) {

    $files = $items | Where-Object { $_.TypedObject -match 'File' } | Measure-Object
    $folders = $items | Where-Object { $_.TypedObject -match 'Folder' } | Measure-Object

}

Simply filter the “File” and “Folder” objects from the query result and measure them, then optionally store the data in an object for further use:

#Combine the data
[ordered]@{
    SPORelativeUrl = $folderSiteRelativeUrl
    SPOFiles       = [int]$files.Count
    SPOFolders     = [int]$folders.Count
}

Measuring folder size

Measuring the size of the library or folder in MBs or GBs is also not that challenging.
First, you need to retrieve the folder (or list) as an object:

#Get the folder object
$folder = Get-PnPFolder -Url $folderSiteRelativeUrl -Connection $connection

The object contains a child object called StorageMetrics that includes the TotalSize of the folder. Prior to using this, the child object needs to be loaded.

#Get the storage metrics
Get-PnPProperty `
    -ClientObject $folder `
    -Property 'StorageMetrics' `
    -Connection $connection

The output of the cmdlet includes several measures:

LastModified        : 21/12/2023 10:28:47
TotalFileCount      : 12
TotalFileStreamSize : 619405
TotalSize           : 632408

The “TotalSize” property is expressed in bytes, so depending on the reporting requirements, converting to MB or even GB might be in order.
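
A minimal sketch of that conversion, reusing the $folder object retrieved above and PowerShell’s built-in 1MB and 1GB constants:

#Convert the total size from bytes to MB and GB, rounded to two decimals
$totalSizeMB = [math]::Round($folder.StorageMetrics.TotalSize / 1MB, 2)
$totalSizeGB = [math]::Round($folder.StorageMetrics.TotalSize / 1GB, 2)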

And it can be even simpler than that. When using the Get-PnPFolderStorageMetric cmdlet (release 2.10), the storage metrics are retrieved directly.
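
As a hedged sketch, assuming the cmdlet accepts a FolderSiteRelativeUrl parameter in the same way Get-PnPFolderItem does:

#Retrieve the storage metrics in a single call
Get-PnPFolderStorageMetric `
    -FolderSiteRelativeUrl $folderSiteRelativeUrl `
    -Connection $connection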

Conclusion

And that’s all there is to it. PnP.PowerShell remains awesome, as this example illustrates. The code snippets above are preferably combined into a function as part of an automated approach.

function Get-FactorySPOItemCount {
    [OutputType('PSCustomObject')]
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true)]
        [string]$listName,
        [Parameter(Mandatory = $false)]
        [string]$destinationFolder,
        [Parameter(Mandatory = $true)]
        $connection)

    $list = Get-PnPList -Identity $listName -Connection $connection

    #Proceed if list is found
    if ($null -ne $list) {

        #Set the relative path
        if ([string]::IsNullOrEmpty($destinationFolder)) {

            [string]$folderSiteRelativeUrl = $list.RootFolder.Name

        }
        else {

            [string]$folderSiteRelativeUrl = $list.RootFolder.Name + '/' + $destinationFolder

        }

        #Set the basic request parameters
        $requestParams = @{
            FolderSiteRelativeUrl = $folderSiteRelativeUrl
            Recursive             = $true
            Connection            = $connection
        }

        $items = Get-PnPFolderItem @requestParams

        #Calculate objects
        if ($null -ne $items) {

            $files = $items | Where-Object { $_.TypedObject -match 'File' } | Measure-Object
            $folders = $items | Where-Object { $_.TypedObject -match 'Folder' } | Measure-Object

        }

        #Combine the data and return it as a custom object
        [PSCustomObject]@{
            SPORelativeUrl = $folderSiteRelativeUrl
            SPOFiles       = [int]$files.Count
            SPOFolders     = [int]$folders.Count
        }
    }
    else {

        Write-Warning "List does not exist ($listName)"

    }
}
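
A quick usage sketch, reusing the $connection from earlier (the library and folder names are placeholders):

#Measure a specific folder in the Documents library
Get-FactorySPOItemCount `
    -listName 'Documents' `
    -destinationFolder 'Subfolder' `
    -connection $connection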

After a migration completes, the function can be run against the destination as part of the proof that the migration was successful.