
Monday, March 28, 2022

Using PowerShell to send custom log data to Log Analytics for Azure Monitor alerting and Kusto Query

I’ve recently had to look into how the Certificates & secrets configured for an organization’s App Registrations could be monitored so that administrators are warned well ahead of expiry and applications using these Enterprise Applications / service principals do not stop working unexpectedly. Microsoft Azure unfortunately does not provide a native way to monitor this (yet), but I managed to find a PowerShell script by Christopher Scott (https://techcommunity.microsoft.com/t5/core-infrastructure-and-security/app-registration-expiration-monitoring-and-notifications/ba-p/2043805) that uses the PowerShell cmdlets Get-AzADApplication, Get-AzADServicePrincipal, and Get-AzADAppCredential to extract the information and send it to Log Analytics. Christopher does a great job walking through the steps, but I did not completely understand why and how every component works. I also could not get his script to work when used in an Automation Account: the data sent to Log Analytics displayed all Certificates & secrets as expired, and the start and end times were missing. Blindly using a script without understanding it isn’t something I recommend or do myself, so I took the time to review it and compare it to the sample script that Microsoft provides here: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#powershell-sample. One area where Christopher’s script differed from Microsoft’s was how the signature and the function that posts to Log Analytics were written, so I decided to use Microsoft’s sample for building the authorization signature and sending data to Log Analytics, and Christopher’s logic for collecting the App Registration Certificates & secrets.

The purpose of this blog post is to demonstrate how to send custom log data to Log Analytics by breaking down and explaining the components of the finalized working script, which captures the Certificates & secrets configured for App Registrations and uses the HTTP Data Collector API to send log data to Azure Monitor via a PowerShell REST API call. The opportunities for sending data to a Log Analytics Workspace so it can be queried are limitless, and going through this hands-on learning exercise was very exciting for me.

Prerequisites

The components we’ll need for this script are as follows:

The Az.Accounts and Az.Resources modules are required for this script, and the versions I have installed are:

Az.Accounts – 2.5.1
Az.Resources – 4.2.0

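If these modules are not already installed, a minimal sketch for installing them from the PowerShell Gallery and confirming the versions looks like this:

# Install the required Az modules (skip if they are already installed)
Install-Module -Name Az.Accounts, Az.Resources -Scope CurrentUser

# Confirm the installed module versions
Get-Module -ListAvailable -Name Az.Accounts, Az.Resources | Select-Object Name, Version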

An account with the Global Reader role, which we’ll use to interactively log onto Azure to retrieve the App Registration configuration details. Using a service principal to run this script is the better choice for unattended execution, so I have also included a PowerShell script that uses a service principal to log into Azure. It can be found here: https://github.com/terenceluk/Azure/blob/main/PowerShell/Get-AppRegistrationExpirationServicePrincipal.ps1
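As a rough sketch of what a service principal login looks like (the application ID, client secret, and tenant ID below are hypothetical placeholders; refer to the linked script for the full version):

# Hypothetical placeholders - replace with your service principal's application ID, client secret and tenant ID
$appId = "<application-id>"
$secret = ConvertTo-SecureString "<client-secret>" -AsPlainText -Force
$tenantId = "<tenant-id>"

# Build a credential object and log in non-interactively as the service principal
$credential = New-Object System.Management.Automation.PSCredential($appId, $secret)
Connect-AzAccount -ServicePrincipal -Credential $credential -Tenant $tenantId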

For the purpose of this post, I’ll continue with logging in interactively with Connect-AzAccount

Workspace ID: This is the Workspace ID of the Log Analytics workspace that will be storing the data. The value can be found by navigating to the Log Analytics workspace > Agents management > Workspace ID.

Shared Key: This is the Primary Key of the Log Analytics workspace that will be storing the data. The value can be found by navigating to the Log Analytics workspace > Agents management > Primary key.

The two values above are similar to Storage Account access keys: anyone who has them is able to send log data into the workspace, so treat them as secrets.


The full PowerShell script can be found at my GitHub repo (https://github.com/terenceluk/Azure/blob/main/PowerShell/Get-AppRegistrationExpirationInteractive.ps1), as well as pasted at the bottom of this post.

Functions

Two functions are required for this script.

The first function builds the signature that will be used in the authorization header when a request is sent to the Azure Monitor HTTP Data Collector API to POST data. This function requires the following parameters to build the signature:

  1. Workspace ID ($customerId) – The Workspace ID of the Log Analytics Workspace
  2. Primary Key ($sharedKey) – The Workspace Primary Key of the Log Analytics Workspace
  3. Date – The current date and time in RFC 1123 format
  4. Content Length – The length (in bytes) of the JSON-formatted data we are sending to Log Analytics
  5. Method – “POST” is the method that will be sent
  6. Content Type – “application/json” is the content type that will be sent
  7. Resource – “/api/logs” is the resource path that will be sent

Once the above parameters are collected, the function will build the signature that will be used in the authorization header and return it to the function that will post data to the Log Analytics Workspace.

# The following function builds the signature used in the authorization header for the request sent to the Azure Monitor HTTP Data Collector API
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
    return $authorization
}

The next function is what will be used to actually send data to the Log Analytics workspace. This function receives the following parameters:

  1. Workspace ID ($customerId) – The Workspace ID of the Log Analytics Workspace
  2. Primary Key ($sharedKey) – The Workspace Primary Key of the Log Analytics Workspace
  3. Body – The data to be sent to the Log Analytics workspace as UTF-8 encoded JSON
  4. Log Type – The name of the custom log (record type) the data should be sent to

Once the above parameters are collected, the function passes the required parameters to the function that builds the signature for the authorization header (the function above), then inserts the Log Analytics Workspace ID to build the URI / API endpoint (https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#request-uri):

https://<CustomerId>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01

It will then create the required headers containing the authorization signature, the custom log name in the Log Analytics workspace, the current date and time, and an optional time-generated field; when this field is left as an empty string, Azure Monitor assumes the time is the message ingestion time.

Finally, the cmdlet Invoke-WebRequest (https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/invoke-webrequest?view=powershell-7.2) is used to send the HTTPS request to the URI / API endpoint.

# The following function will create and post the request using the signature created by the Build-Signature function for authorization
Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}

Here is an example of what the $headers variable would contain:

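With illustrative (made-up) values, it would look roughly like this:

@{
    "Authorization" = "SharedKey b0d472a3-8c13-4cec-8abb-76051843545f:<Base64-encoded HMAC-SHA256 hash>"
    "Log-Type" = "AppRegistrationExpiration"
    "x-ms-date" = "Mon, 28 Mar 2022 14:30:00 GMT"
    "time-generated-field" = ""
}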

Connecting to Azure

The cmdlet Connect-AzAccount is used to interactively connect and authenticate the PowerShell session.

# Log in interactively with an account with the Global Reader role
Connect-AzAccount

Defining Variables

As mentioned in the prerequisites, we’ll need the following variables assigned with their values.

  1. Workspace ID ($customerId) – The Workspace ID of the Log Analytics Workspace
  2. Primary Key ($sharedKey) – The Workspace Primary Key of the Log Analytics Workspace
  3. Log Type – The name of the custom log (record type) the data should be sent to
  4. Time Stamp Field – This variable is optional and we’ll be leaving it empty for this script

# Replace with your Workspace ID
$customerId = "b0d472a3-8c13-4cec-8abb-76051843545f"

# Replace with your Workspace Primary Key
$sharedKey = "D3s71+X0M+Q3cGTHC5I6H6l23xRNAKvjA+yb8JzMQQd3ntxeFZLmMWIMm7Ih/LPMOji9zkXDwavAJLX1xEe/4g=="

# Specify the name of the record type that you'll be creating (this is what will be displayed under Log Analytics > Logs > Custom Logs)
$LogType = "AppRegistrationExpiration"

# Optional name of a field that includes the timestamp for the data. If the time field is not specified, Azure Monitor assumes the time is the message ingestion time
$TimeStampField = ""

Obtaining list of App Registrations and Enterprise Applications / Service Principals

The list of App Registrations and Enterprise Applications / Service Principals for the Azure AD tenant will be retrieved and stored in variables.

# Get the full list of Azure AD App Registrations
$applications = Get-AzADApplication


# Get the full list of Azure AD Enterprise Applications (Service Principals)
$servicePrincipals = Get-AzADServicePrincipal


Filter for App Registrations that have Certificates & secrets configured

Next, an array will be created to store applications that have Certificates & secrets configured. Then the array will be populated with the following fields:

  1. DisplayName
  2. ObjectId
  3. ApplicationId
  4. KeyId
  5. Type
  6. StartDate
  7. EndDate

# Create an array named appWithCredentials
$appWithCredentials = @()

# Populate the array with App Registrations that have credentials
# Retrieve the list of applications and sort them by DisplayName
$appWithCredentials += $applications | Sort-Object -Property DisplayName | % {
    # Assign the variable $application with the current App Registration in the pipeline
    $application = $_
    # Retrieve the list of Enterprise Applications (Service Principals) and match the ApplicationId of the SP to the App Registration
    $sp = $servicePrincipals | ? ApplicationId -eq $application.ApplicationId
    Write-Verbose ('Fetching information for application {0}' -f $application.DisplayName)
    # Use the Get-AzADAppCredential cmdlet to get the Certificates & secrets configured (this returns StartDate, EndDate, KeyId, Type, Usage, CustomKeyIdentifier)
    # Populate the array with the DisplayName, ObjectId, ApplicationId, KeyId, Type, StartDate and EndDate of each certificate and secret for each App Registration
    $application | Get-AzADAppCredential -ErrorAction SilentlyContinue | Select-Object `
        -Property @{Name='DisplayName'; Expression={$application.DisplayName}}, `
        @{Name='ObjectId'; Expression={$application.ObjectId}}, `
        @{Name='ApplicationId'; Expression={$application.ApplicationId}}, `
        @{Name='KeyId'; Expression={$_.KeyId}}, `
        @{Name='Type'; Expression={$_.Type}}, `
        @{Name='StartDate'; Expression={$_.StartDate -as [datetime]}}, `
        @{Name='EndDate'; Expression={$_.EndDate -as [datetime]}}
}


Adding additional fields to specify whether a certificate or secret has expired

It is possible to immediately send the information already collected to the Log Analytics workspace, but Chris Scott took it one step further and appended additional fields indicating whether the certificate or secret is expired, the timestamp used to check the validity, and the days until expiry. This could also be accomplished later with a Kusto query (a sketch of the query-time approach follows the code below), but I find it very handy to add here.

# With the $appWithCredentials array populated with the Certificates & secrets and their App Registrations, proceed to calculate and add the following fields to each record in the array:
# The expiration status of the certificate or secret - Valid or Expired
# The timestamp used to calculate the validity
# The days until the certificate or secret expires
Write-Output 'Validating expiration data...'
$timeStamp = Get-Date -Format o
$today = (Get-Date).ToUniversalTime()
$appWithCredentials | Sort-Object EndDate | % {
    # The if block catches certificates & secrets that are expired
    if ($_.EndDate -lt $today) {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Expired'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timeStamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    # The else block catches certificates & secrets that are still valid
    } else {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Valid'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timeStamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    }
}
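For comparison, roughly the same calculation can be done at query time instead. Here is a sketch of a Kusto query that assumes the custom log name used later in this post (Azure Monitor appends _CL to the record type name and a type suffix such as _t to each field, as shown in the query section below):

AppRegistrationExpiration_CL
| extend DaysToExpiration = datetime_diff('day', EndDate_t, now())
| extend Status = iff(EndDate_t < now(), "Expired", "Valid")
| project DisplayName_s, KeyId_g, Type_s, EndDate_t, Status, DaysToExpiration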


Converting data to be sent to Log Analytics to JSON

The HTTP Data Collector API expects the data to be in JSON format so the collected information is converted:

# Convert the list of Certificates & secrets for each App Registration into JSON format so we can send it to Log Analytics
$appWithCredentialsJSON = $appWithCredentials | ConvertTo-Json

## The following commented lines are a sample JSON that can be used to test sending data to Log Analytics
<#
$json = @"
[{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "2ea30e24-e2ad-44ff-865a-df07199f26a5",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T18:26:46",
    "EndDate": "2022-05-29T18:46:46"
},
{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "259dbc4d-cdde-4007-a9ed-887437560b15",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T17:46:22",
    "EndDate": "2022-05-29T18:06:22"
}]
"@
#>

Use the Post-LogAnalyticsData function to send the collected data to the Log Analytics Workspace

With the data collected, proceed to use the Post-LogAnalyticsData function to send the data:

# Submit the data to the API endpoint
Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($appWithCredentialsJSON)) -logType $logType

A successful POST will return a status code of 200.

More information about the return codes can be found here: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#return-codes

------------------------------------------------------------------------------------------------------------

Note that it will take a bit of time before the data is displayed in the Log Analytics Workspace. I’ve found that I sometimes have to wait upwards of 15 minutes or more before it is displayed.

When it is displayed, you should see a table with the log name you specified listed under Custom Logs in the Log Analytics workspace.

Querying the log will display records containing the following fields as you scroll across:

  1. TimeGenerated [UTC]
  2. Computer
  3. RawData
  4. DisplayName_s
  5. ObjectId_g
  6. ApplicationId_value_g
  7. ApplicationId_Guid_g
  8. KeyId_g
  9. Type_s
  10. StartDate_t [UTC]
  11. EndDate_t [UTC]
  12. Status_s
  13. TimeStamp_t [UTC]
  14. DaysToExpiration_d
  15. Type
  16. _ResourceId
  17. TenantId
  18. SourceSystem
  19. MG
  20. ManagementGroupName
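With these fields in place, a Kusto query along the following lines (a sketch using the field names above) can be run in Log Analytics or used as the basis for an Azure Monitor alert to surface certificates and secrets expiring within the next 30 days:

AppRegistrationExpiration_CL
| where Status_s == "Valid"
| where DaysToExpiration_d <= 30
| project DisplayName_s, KeyId_g, Type_s, EndDate_t, DaysToExpiration_d
| sort by DaysToExpiration_d asc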

You might be wondering the following:

Question: Why are there _s, _g, _d suffixes appended to some of the field names?

Answer: These suffixes are automatically appended based on the record type of each property: _s for string, _b for Boolean, _d for double, _t for date/time, and _g for GUID.


See the following link for more information: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#record-type-and-properties

Question: Why are there extra fields?

Answer: Some of them are reserved properties (e.g. tenant, TimeGenerated, RawData) and the others are default fields that Azure Monitor adds to every record.

------------------------------------------------------------------------------------------------------------

I hope this post provides a bit more information about how to send custom log data to a Log Analytics Workspace. The Microsoft documentation:

Send log data to Azure Monitor by using the HTTP Data Collector API (preview)
https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api

… does a fantastic job of explaining all the components in detail, though it is a bit of a long read, so I hope this blog post provides a shortened version of the mechanics.

I will be following up with another post that demonstrates how to automate this script to collect the App Registrations’ Certificates and Secrets expiration data into Log Analytics, then use a Logic App to create and send a report via email, so stay tuned.

------------------------------------------------------------------------------------------------------------

# The following function builds the signature used in the authorization header for the request sent to the Azure Monitor HTTP Data Collector API
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
    return $authorization
}

# The following function will create and post the request using the signature created by the Build-Signature function for authorization
Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}

# Log in interactively with an account with the Global Reader role
Connect-AzAccount

# Replace with your Workspace ID
$customerId = "b0d472a3-8c13-4cec-8abb-76051843545f"

# Replace with your Workspace Primary Key
$sharedKey = "D3s71+X0M+Q3cGTHC5I6H6l23xRNAKvjA+yb8JzMQQd3ntxeFZLmMWIMm7Ih/LPMOji9zkXDwavAJLX1xEe/4g=="

# Specify the name of the record type that you'll be creating (this is what will be displayed under Log Analytics > Logs > Custom Logs)
$LogType = "AppRegistrationExpiration"

# Optional name of a field that includes the timestamp for the data. If the time field is not specified, Azure Monitor assumes the time is the message ingestion time
$TimeStampField = ""

# Get the full list of Azure AD App Registrations
$applications = Get-AzADApplication

# Get the full list of Azure AD Enterprise Applications (Service Principals)
$servicePrincipals = Get-AzADServicePrincipal

# Create an array named appWithCredentials
$appWithCredentials = @()

# Populate the array with App Registrations that have credentials
# Retrieve the list of applications and sort them by DisplayName
$appWithCredentials += $applications | Sort-Object -Property DisplayName | % {
    # Assign the variable $application with the current App Registration in the pipeline
    $application = $_
    # Retrieve the list of Enterprise Applications (Service Principals) and match the ApplicationId of the SP to the App Registration
    $sp = $servicePrincipals | ? ApplicationId -eq $application.ApplicationId
    Write-Verbose ('Fetching information for application {0}' -f $application.DisplayName)
    # Use the Get-AzADAppCredential cmdlet to get the Certificates & secrets configured (this returns StartDate, EndDate, KeyId, Type, Usage, CustomKeyIdentifier)
    # Populate the array with the DisplayName, ObjectId, ApplicationId, KeyId, Type, StartDate and EndDate of each certificate and secret for each App Registration
    $application | Get-AzADAppCredential -ErrorAction SilentlyContinue | Select-Object `
        -Property @{Name='DisplayName'; Expression={$application.DisplayName}}, `
        @{Name='ObjectId'; Expression={$application.ObjectId}}, `
        @{Name='ApplicationId'; Expression={$application.ApplicationId}}, `
        @{Name='KeyId'; Expression={$_.KeyId}}, `
        @{Name='Type'; Expression={$_.Type}}, `
        @{Name='StartDate'; Expression={$_.StartDate -as [datetime]}}, `
        @{Name='EndDate'; Expression={$_.EndDate -as [datetime]}}
}

# With the $appWithCredentials array populated with the Certificates & secrets and their App Registrations, proceed to calculate and add the following fields to each record in the array:
# The expiration status of the certificate or secret - Valid or Expired
# The timestamp used to calculate the validity
# The days until the certificate or secret expires
Write-Output 'Validating expiration data...'
$timeStamp = Get-Date -Format o
$today = (Get-Date).ToUniversalTime()
$appWithCredentials | Sort-Object EndDate | % {
    # The if block catches certificates & secrets that are expired
    if ($_.EndDate -lt $today) {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Expired'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timeStamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    # The else block catches certificates & secrets that are still valid
    } else {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Valid'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timeStamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    }
}

# Convert the list of Certificates & secrets for each App Registration into JSON format so we can send it to Log Analytics
$appWithCredentialsJSON = $appWithCredentials | ConvertTo-Json

## The following commented lines are a sample JSON that can be used to test sending data to Log Analytics
<#
$json = @"
[{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "2ea30e24-e2ad-44ff-865a-df07199f26a5",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T18:26:46",
    "EndDate": "2022-05-29T18:46:46"
},
{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "259dbc4d-cdde-4007-a9ed-887437560b15",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T17:46:22",
    "EndDate": "2022-05-29T18:06:22"
}]
"@
#>

# Submit the data to the API endpoint
Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($appWithCredentialsJSON)) -logType $logType

1 comment:

Anonymous said...

This is fantastic and really saved my butt, thank you so so much. I have a question:

Is there a way to get information that lets you know who the owner is (not the owner section in app reg) but from places like tags? For example, if you add a tag that is the owner's username, can you extract that? Or can you extract notes from the app registration's Branding and Properties / internal notes section?

Also, currently it will show the same apps expired every month. Is there a way to exclude apps that have already been flagged the month before?

Thanks for your time and effort on this.