The Great Disk Space Hunt: Why Azure Resource Graph Won’t Tell You Everything
- Shannon
Like a lot of my recent blog posts, this one grew out of a customer question. I get asked all sorts of things by customers trying to make sense of Azure, and this time it started innocently enough with a seemingly simple question:
“Hey, how do I find out how much actual used disk space our Azure VMs have?”
Ah, the classic question about how much storage is left. I smiled, nodded, and thought, this is going to be easy. Azure Resource Graph has all the answers.
Spoiler alert: It does not.
Step 1: Thinking Resource Graph Would Save the Day
I imagined it would be straightforward: query Azure Resource Graph, pull the VM list, and magically see all used and free space across all disks.
I typed some queries, hoping to see columns like UsedSpaceGB and FreeSpaceGB. I even did a little victory dance when IntelliSense suggested StorageProfile and OSDisk.
Then reality hit.
Azure Resource Graph shows disk provisioned sizes, not actual used space.
It is like looking at a pizza box and trying to guess how much pizza is left inside. You can see the box is 12 inches, but there is no way to know if someone already ate half the slices.
So much for that shortcut. #backtothedrawingboard
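If you want to see what I mean, a Resource Graph query along these lines happily returns the provisioned size of every managed disk, but says nothing about what is actually consumed inside the guest:
Resources
| where type =~ "microsoft.compute/disks"
| project name, resourceGroup, provisionedSizeGB = properties.diskSizeGB, managedBy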
Step 2: Enter KQL — My Cloud Whisperer
If Azure Resource Graph is the pizza box, KQL in Log Analytics is like asking the chef:
“Hey, how many slices are gone and how many are left?”
By querying VM Insights with KQL, I was able to dig into metrics like FreeSpaceMB and FreeSpacePercentage. Then, with a little magic wand waving, I calculated used space in GB across all drives reporting to a single Log Analytics workspace.
Here is the key snippet that did the trick for anyone with a single, main Log Analytics workspace:
InsightsMetrics
// VM Insights logical disk performance counters
| where Origin == "vm.azm.ms" and Namespace == "LogicalDisk"
| where Name in ("FreeSpaceMB", "FreeSpacePercentage")
// Pull the drive letter / mount point out of the Tags payload
| extend Disk = tostring(todynamic(Tags)["vm.azm.ms/mountId"])
// Keep only the latest sample per computer, counter, and disk
| summarize arg_max(TimeGenerated, Val) by Computer, Name, Disk, _ResourceId
// Pivot the two counters into columns on one row per disk
| extend Packed = bag_pack(Name, Val)
| summarize Packed = make_bag(Packed) by TimeGenerated, Computer, Disk, _ResourceId
| evaluate bag_unpack(Packed)
// Total MB = free MB / (free % / 100); used = total minus free, then convert MB to GB
| extend UsedSpaceGB = ceiling((todecimal(FreeSpaceMB) / (FreeSpacePercentage / todecimal(100)) - FreeSpaceMB) / 1024), FreeSpaceGB = FreeSpaceMB / 1024
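If the math in that last extend looks dense, here is the idea with made-up numbers: a disk reporting 51,200 MB free at 25 percent free must be 204,800 MB in total, which works out to about 150 GB used. You can sanity-check it right in Log Analytics:
// Hypothetical disk: 51,200 MB free at 25% free => 200 GB total, 150 GB used
print UsedSpaceGB = ceiling((todecimal(51200) / (25 / todecimal(100)) - 51200) / 1024)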
Suddenly, I had the numbers I needed in my testing environment: used space, free space, and a sanity check that the VM was not silently filling up like an unattended pot of soup. The caveat here is you must make sure you have a Log Analytics workspace and that you've turned on VM Insights.
Also, what if you have multiple Log Analytics workspaces? Thankfully, you can lean on cross-workspace queries to grab the same data across all of them. Here's how:
union
workspace("workspace-id-1").InsightsMetrics,
workspace("workspace-id-2").InsightsMetrics,
workspace("workspace-id-3").InsightsMetrics
| where Origin == "vm.azm.ms" and Namespace == "LogicalDisk"
| where Name in ("FreeSpaceMB", "FreeSpacePercentage")
| extend Disk=tostring(todynamic(Tags)["vm.azm.ms/mountId"])
| summarize arg_max(TimeGenerated, Val) by Computer, Name, Disk, _ResourceId
| extend Packed = bag_pack(Name, Val)
| summarize Packed = make_bag(Packed) by TimeGenerated, Computer, Disk, _ResourceId
| evaluate bag_unpack(Packed)
| extend UsedSpaceGB = ceiling((todecimal(FreeSpaceMB) / (FreeSpacePercentage / todecimal(100)) - FreeSpaceMB) / 1024), FreeSpaceGB = FreeSpaceMB / 1024
| project TimeGenerated, Computer, Disk, _ResourceId, UsedSpaceGB, FreeSpaceGB
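One quick helper: if you need the workspace IDs (the GUIDs) to plug into those workspace() references, the Az.OperationalInsights PowerShell module can list them, assuming you are already signed in with Connect-AzAccount:
# List Log Analytics workspaces and the workspace (customer) IDs used by workspace("...")
Get-AzOperationalInsightsWorkspace | Select-Object Name, CustomerId, ResourceGroupName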
Step 3: PowerShell — The Hands-On Detective
Sometimes KQL is not enough. In this case, the customer had not yet enabled Log Analytics with VM Insights for everything in Azure and ran a predominantly Windows environment. They also had not yet ported their enterprise observability tooling into Azure (another spot to grab disk utilization from). Other factors can come into play too: maybe there are on-prem VMs, or some machines are not reporting properly to Log Analytics. Enter PowerShell.
With a few commands, I was able to:
Pull all Windows servers from Active Directory (note: this can be swapped for Azure using the Az PowerShell module if needed)
Loop through each server
Query all local drives
Calculate total, free, and used space in GB
Export results to CSV
<#
.SYNOPSIS
Retrieves disk usage information from Windows VMs and exports it to a CSV file.
.DESCRIPTION
This script queries all Windows VMs from a chosen source (Active Directory or
Azure), collects usage on all local disks, calculates total, free, and used
space in GB, and exports the results to a CSV file for reporting or analysis.
It tests connectivity to each VM before querying and handles errors gracefully.
The server loop runs sequentially here; on PowerShell 7 it can be adapted to
ForEach-Object -Parallel for faster performance in large environments.
.PARAMETER ExportFile
Full file path where the CSV output should be saved.
.PARAMETER Source
Specifies the source for VM discovery. Acceptable values are:
- "AD" : Active Directory
- "Azure" : Azure subscription
.NOTES
Author: Shannon Eldridge-Kuehn
Date: 2025-08-16
Requirements:
- Active Directory module for AD queries
- Az module for Azure queries (Install-Module Az)
- PowerShell 5.1 or higher; PowerShell 7 recommended for parallel execution
#>
param (
    [Parameter(Mandatory=$true)]
    [ValidateSet("AD", "Azure")]
    [string]$Source,
    [Parameter(Mandatory=$true)]
    [string]$ExportFile
)
# Initialize details array
$details = @()
# Discover VMs based on the selected source
switch ($Source) {
    "AD" {
        Write-Host "Fetching Windows VMs from Active Directory..." -ForegroundColor Cyan
        Import-Module ActiveDirectory -ErrorAction Stop
        $servers = Get-ADComputer -Filter {OperatingSystem -like "*Windows*"} |
            Select-Object -ExpandProperty Name
    }
    "Azure" {
        Write-Host "Fetching Windows VMs from Azure..." -ForegroundColor Cyan
        Import-Module Az -ErrorAction Stop
        Connect-AzAccount -ErrorAction Stop
        $servers = Get-AzVM | Where-Object {$_.StorageProfile.OsDisk.OsType -eq "Windows"} |
            Select-Object -ExpandProperty Name
    }
}
# Check if any servers were found
if (-not $servers -or $servers.Count -eq 0) {
    Write-Warning "No servers found. Exiting script."
    return
}
Write-Host "Found $($servers.Count) servers. Starting disk query..." -ForegroundColor Green
# Loop through each server
foreach ($server in $servers) {
    try {
        if (Test-Connection -ComputerName $server -Count 1 -Quiet) {
            # Query local disks (DriveType 3)
            $disks = Get-CimInstance -ClassName Win32_LogicalDisk -ComputerName $server |
                Where-Object { $_.DriveType -eq 3 }
            foreach ($disk in $disks) {
                $details += [PSCustomObject]@{
                    VMName = $server
                    DriveLetter = $disk.DeviceID
                    TotalSizeGB = [math]::Round($disk.Size / 1GB, 2)
                    FreeSpaceGB = [math]::Round($disk.FreeSpace / 1GB, 2)
                    UsedSpaceGB = [math]::Round(($disk.Size - $disk.FreeSpace) / 1GB, 2)
                }
            }
        } else {
            Write-Warning "Server $server is not reachable."
        }
    } catch {
        Write-Error "Failed to query ${server}: $_"
    }
}
# Output results to console, comment out if CSV is preferred
$details | Format-Table -AutoSize
# Export to CSV, comment out if console export is preferred
$details | Export-Csv -Path $ExportFile -NoTypeInformation
Write-Host "Disk usage report exported to $ExportFile" -ForegroundColor Green
PowerShell is like sending a personal intern into each VM with a clipboard, tallying every byte. It is hands-on, precise, and impossible to argue with.
Step 4: The Lesson
Here is the moral of the story:
Azure Resource Graph can tell you how big the box is.
KQL / VM Insights can tell you how much pizza is left.
PowerShell / WMI can walk the kitchen, check every plate, and report back.
When a customer asks for actual used space, there is no single Azure solution and context matters. You may have to combine cloud metrics with on-the-ground inspection and embrace a little scripting wizardry.
Next time someone asks, “How much storage do we have left?”, you can confidently say:
“Let me check the boxes, ask the chef, and if needed, send our trusty intern to count every slice.”
Now, just in case the code above is tricky to follow (my hosting provider has arguably made some of this better and worse simultaneously), here are some GitHub links:
Disk Space Utilization - Single Log Analytics Workspace (KQL)