Building a Windows Storage Dashboard, Part 2: Collecting Historical Data
The second part of this three-part series on building a Windows storage dashboard provides a technique for gathering historical data.

[Editor's Note: This is Part 2 of a three-part series on building a Windows storage dashboard.]
In my first article in this series, I showed you how to build a basic disk monitoring dashboard using PowerShell. Now I want to start importing historical data into that dashboard, which means we first need a way to collect it.
Gathering historical data involves two main steps. The first step is to build a script that retrieves and saves the data you are interested in. The second step is to configure the Windows Task Scheduler to run the script at regular intervals so that you can watch how your storage consumption changes over time.
I have already written a detailed article about using the Windows Task Scheduler to run PowerShell scripts on a schedule, so I won't repeat the full procedure here. The key point is that a scheduled task can't call a PowerShell script directly. Instead, the task must be configured to run PowerShell.exe, with the Add Arguments field containing the -File parameter, followed by the path and filename of the script that you want to run, enclosed in quotation marks. You can find more detailed information in my original article.
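For reference, here is a minimal sketch of how such a task could be registered from PowerShell itself, using the ScheduledTasks module built into Windows 8/Server 2012 and later. The script path, task name, and daily 9 a.m. schedule are my own assumptions; adjust them to match your environment.

# A sketch of registering the task from PowerShell rather than the GUI.
# The script path, task name, and daily trigger are assumptions.
$Action = New-ScheduledTaskAction -Execute 'PowerShell.exe' -Argument '-File "C:\Scripts\Get-DiskUsage.ps1"'
$Trigger = New-ScheduledTaskTrigger -Daily -At 9am
Register-ScheduledTask -TaskName 'Disk Usage History' -Action $Action -Trigger $Trigger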
Storing the Data
For now, I want to focus on building the script that collects the storage data. One of the decisions you will have to make when building such a script is choosing where to store the data. There are several good options. You could easily write the data to a SQL Server database, a text file, or even a JSON file. However, the easiest option is probably going to be to write the data to a CSV file.
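If you are curious what the JSON alternative might look like, here is a quick sketch that appends one compact JSON record per drive to a log file. The log path is an assumption, and the snippet is only for comparison; the rest of this article uses the CSV approach.

# Sketch of the JSON alternative: one JSON record per drive, per run.
# The log path is an assumption; this article actually uses CSV.
$JsonLog = 'C:\Scripts\DiskUsageHistory.json'
Get-PSDrive -PSProvider 'FileSystem' | ForEach-Object {
    [PSCustomObject]@{
        Timestamp = Get-Date
        Drive     = $_.Name
        UsedGB    = [Math]::Round($_.Used / 1GB, 2)
    } | ConvertTo-Json -Compress | Add-Content -Path $JsonLog
}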
The script that I came up with is very short. Here is the script:
$LogFile = "C:\Scripts\DiskUsageHistory.csv"

# Gather every file system drive and log its usage, in GB, to the CSV file
Get-PSDrive -PSProvider 'FileSystem' | ForEach-Object {
    [PSCustomObject]@{
        Timestamp = (Get-Date)
        Drive     = $_.Name
        UsedGB    = "{0:N2}" -f ($_.Used / 1GB)
        FreeGB    = "{0:N2}" -f ($_.Free / 1GB)
        TotalGB   = "{0:N2}" -f (($_.Used + $_.Free) / 1GB)
    }
} | Export-Csv -Path $LogFile -Append -NoTypeInformation
The first command in this script sets up a variable called $LogFile, which includes the full path and filename of the file where I am storing the logging data. In this case, that data is being stored in C:\Scripts\DiskUsageHistory.csv.
The next thing the script does is use the Get-PSDrive cmdlet to retrieve a list of the system's drives. It's worth noting, however, that using this command by itself causes PowerShell to retrieve more than just a list of disks. It also includes PowerShell's Variable, WSMan, Registry, Function, Alias, Certificate, and Environment drives. That being the case, I am using the -PSProvider parameter to filter the results so that only file system drives are included. This will cause the script to log both local disks and network drives.
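If you want to see the difference for yourself, compare the unfiltered and filtered output before wiring the cmdlet into the script:

# Without the filter: variable, registry, environment drives, and so on
Get-PSDrive | Select-Object Name, Provider

# With the filter: only the file system drives the script will log
Get-PSDrive -PSProvider 'FileSystem' | Select-Object Name, Used, Free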
The script then pipes each drive returned by Get-PSDrive through a ForEach-Object loop. For each drive, the loop builds a custom PowerShell object with properties for Timestamp, Drive, UsedGB, FreeGB, and TotalGB. As in the script from Part 1 of this series, the used, free, and total disk space are expressed in gigabytes, rounded to two decimal places, to keep the values from being excessively long. One thing to keep in mind is that the N2 format operator stores those values as strings, complete with thousands separators, so they will need to be converted back to numbers before you can do any math or charting with them.
Exporting to CSV: Key Parameters
Once the custom objects have been created (the loop creates one custom object for each drive), the results are then piped to the Export-CSV cmdlet. This cmdlet is what actually creates the CSV file. You'll notice that I am using three parameters with this cmdlet.
The first parameter is -Path. This allows the script to write the CSV file to the path and filename contained within the $LogFile variable. The second parameter being used is -Append. This parameter causes new data to be written to the end of the existing CSV file (a new CSV file will be created if the file does not exist). Without this parameter, the CSV file would be overwritten every time the script is run.
The final parameter used is -NoTypeInformation. This prevents Export-Csv from writing a type information line (such as #TYPE System.Management.Automation.PSCustomObject) above the header row, which makes the CSV file a little bit easier to work with. In PowerShell 6 and later, type information is omitted by default, but including the parameter keeps the script compatible with Windows PowerShell 5.1.
Here is the data that was created when I ran the above script twice:
"Timestamp","Drive","UsedGB","FreeGB","TotalGB"
"5/30/2025 3:29:22 PM","C","387.48","1,474.53","1,862.01"
"5/30/2025 3:29:22 PM","D","0.00","0.00","0.00"
"5/30/2025 3:29:22 PM","E","0.00","0.00","0.00"
"5/30/2025 3:29:22 PM","F","0.00","0.00","0.00"
"5/30/2025 3:29:22 PM","G","0.00","0.00","0.00"
"5/30/2025 3:29:22 PM","H","0.00","0.00","0.00"
"5/30/2025 3:29:22 PM","M","7,885.12","2,114.76","9,999.87"
"5/30/2025 3:29:22 PM","Q","15,142.20","25,817.67","40,959.87"
"5/30/2025 3:29:22 PM","R","604.36","1,258.64","1,863.00"
"5/30/2025 3:29:22 PM","V","97.73","3,998.14","4,095.87"
"5/30/2025 3:29:22 PM","W","92.00","1,771.00","1,863.00"
"5/30/2025 3:29:52 PM","C","387.48","1,474.53","1,862.01"
"5/30/2025 3:29:52 PM","D","0.00","0.00","0.00"
"5/30/2025 3:29:52 PM","E","0.00","0.00","0.00"
"5/30/2025 3:29:52 PM","F","0.00","0.00","0.00"
"5/30/2025 3:29:52 PM","G","0.00","0.00","0.00"
"5/30/2025 3:29:52 PM","H","0.00","0.00","0.00"
"5/30/2025 3:29:52 PM","M","7,885.12","2,114.76","9,999.87"
"5/30/2025 3:29:52 PM","Q","15,142.20","25,817.67","40,959.87"
"5/30/2025 3:29:52 PM","R","604.36","1,258.64","1,863.00"
"5/30/2025 3:29:52 PM","V","97.73","3,998.14","4,095.87"
"5/30/2025 3:29:52 PM","W","92.00","1,771.00","1,863.00"
As you can see, the CSV file contains a header row, followed by one row per drive for each run, recording a timestamp along with the drive letter, used space, free space, and total capacity of each disk.
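As a quick sanity check (this is separate from the Part 3 dashboard code), you can read the history back with Import-Csv. Because the logging script stores the gigabyte values as formatted strings, the thousands separators need to be stripped before the values can be treated as numbers again:

# Read the history back and convert one drive's usage to numeric values.
# The -replace strips the thousands separator that the N2 format added.
Import-Csv 'C:\Scripts\DiskUsageHistory.csv' |
    Where-Object { $_.Drive -eq 'C' } |
    Select-Object Timestamp, @{ Name = 'UsedGB'; Expression = { [double]($_.UsedGB -replace ',', '') } }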
Now that I have shown you how to collect historical data for your disks, in Part 3 of this series, I will show you how to incorporate that data into the script that I started out with in Part 1.