
Wednesday, March 30, 2022

Using a Logic App to send a recurring report for App Registration Certificates & secrets expiry

As a follow up to my previous post:

Using PowerShell to send custom log data to Log Analytics for Azure Monitor alerting and Kusto Query
http://terenceluk.blogspot.com/2022/03/using-powershell-to-send-custom-log.html

I recently looked into how the Certificates & secrets configured for Azure AD App Registrations could be monitored so administrators can be warned ahead of expiry. This came from a recommendation I proposed for a client whose Azure team would experience outages due to expired credentials. The client had hundreds of partners authenticating against their Azure AD, and the unplanned downtime was a major pain point.

As indicated in my previous post, and as anyone who has attempted this will know, Azure does not provide a native built-in method for monitoring these credentials, but there are two solutions provided by the following professionals:

App Registration Expiration Monitoring and Notifications – By Christopher Scott
https://techcommunity.microsoft.com/t5/core-infrastructure-and-security/app-registration-expiration-monitoring-and-notifications/ba-p/2043805

Use Power Automate to Notify of Upcoming Azure AD App Client Secrets and Certificate Expirations – By Russ Rimmerman
https://techcommunity.microsoft.com/t5/core-infrastructure-and-security/use-power-automate-to-notify-of-upcoming-azure-ad-app-client/ba-p/2406145

I liked the solution that Christopher Scott created, so I went ahead and tested it but ran into an issue caused by the updated Az.Resources module, as described in this post:

Attempting to use Chris Scott's App Registration Expiration Monitoring and Notifications displays data with "DaysToExpiration" set to "-738,241"
http://terenceluk.blogspot.com/2022/03/attempting-to-use-chris-scotts-app.html

I also noticed that the Kusto query he provided did not cover App Registrations that had more than one secret or certificate configured. Lastly, I preferred to have a pre-scheduled report sent out with a list of expiring certificates and secrets, so this post outlines the configuration and some minor tweaks to this great solution.

Create Log Analytics Workspace

Begin by creating a Log Analytics Workspace that will store the custom App Registration certificate and secret data:

image

With the Log Analytics workspace created, navigate to the Agents management blade to collect the following information:

  1. Workspace ID
  2. Workspace Primary Key

These two parameters will be required for setting up the Automation Account so it can send custom Log Analytics data to this workspace.

image

Create and Configure App Registration for Automation Account Runbook

The Automation Account Runbook that we’ll be creating shortly will use a service principal to execute a PowerShell script that collects every App Registration and Enterprise Application / Service Principal along with their configured certificates and secrets. This service principal will need the Global Reader role to obtain the information.

**Note that we can also use a Managed Identity for the execution of the PowerShell script as shown in one of my previous blog posts: http://terenceluk.blogspot.com/2022/02/using-automation-account-to-monitor-vms.html

image

With the App Registration created, document the following information:

  1. Application (client) ID <- this represents the service principal’s user name
  2. Directory (tenant) ID
image

Proceed to create a client secret in the Certificates & secrets blade and document the value of the secret before navigating away from the page, as it will not be displayed again:

image

Next, navigate to the Roles and administrators blade for the Azure AD tenant, search for Global Reader and add the service principal to the role:

image

image

Create Automation Account

With the prerequisites for the Automation Account configured, the next step is to create the Automation Account that will use a PowerShell script to extract the App Registrations along with their configured certificates, secrets, and expiry dates, and send the data into the Log Analytics Workspace:

image

image

Configure Variables and Credentials for the Automation Account

A Runbook containing a PowerShell script will be used to obtain the certificates and secrets of App Registrations by authenticating against Azure AD, then send the data to the Log Analytics Workspace. It is best practice to store variables and credentials outside of the script, so we’ll store them securely within the Variables and Credentials blades of the Automation Account.

Navigate to the Variables blade and configure the following 3 variables:

  1. MonitoredTenantID
  2. WorkspaceID
  3. WorkspacePrimaryKey

It is possible to store these variables as encrypted to further protect their values; for the purpose of this example, I will only store the Workspace Primary Key as encrypted:

image

image

Navigate to the Credentials blade and configure the App Registration / Service Principal credential that the PowerShell script will use to authenticate against Azure AD:

image

image

Create and Configure an Automation Account Runbook

With the credentials and variables for the PowerShell script configured, navigate to the Runbooks blade and create a new runbook:

image

Provide the following:

Name: <name of choice>
Runbook type: <PowerShell>
Runtime version: <5.1>
Description: <description of choice>

image

You will be brought into the Edit window of the Runbook after creating it:

image

Proceed to paste the following PowerShell script into the window: https://github.com/terenceluk/Azure/blob/main/PowerShell/Get-AppRegistrationExpirationAutomation.ps1

A more detailed breakdown of what the script does can be found in my previous blog post: http://terenceluk.blogspot.com/2022/03/using-powershell-to-send-custom-log.html

I will also include the PowerShell script at the end of this blog post.

image

Note how the script references the credentials and the variables we defined earlier for the Automation Account:

image

Use the Test pane feature to execute a test and verify that the script runs without any errors:

image

A return code of 200 is what we’re looking for:

image

Proceed to publish the script:

image

Next, schedule the Runbook to run according to a day, time, and frequency of your choice:

image

With the runbook configured, proceed to test executing the Runbook and confirm that the desired data is being sent to the Log Analytics Workspace:

image

image

Set up reporting with Logic Apps

With the Automation Account and Runbook successfully set up and verified, proceed to create a new Logic App that will query the data in the Log Analytics Workspace, generate an HTML report, and send it to the desired email address.

Navigate into the Logic Apps service and create a new Logic App:

image

Once the Logic App has been created, click on Logic app designer:

image

We’ll be creating 3 steps for this Logic App where:

  1. Recurrence: This will configure a recurring schedule for this Logic App to execute
  2. Run query and visualize results: This will allow us to run the Kusto query, set a Time Range and specify a Chart Type
  3. Send an email (V2): This will allow us to send the Kusto query results via email
image

Recurrence:

Configure the desired frequency of this Logic App:

image

Run query and visualize results:

Fill in the following fields:

Subscription: <your subscription>
Resource Group: <desired resource group>
Resource Type: Log Analytics Workspace
Resource Name: <The Log Analytics Workspace containing the custom log data>
Query: See the query in my GitHub: https://github.com/terenceluk/Azure/blob/main/Kusto%20KQL/Get%20Expiring%20and%20Expired%20App%20Reg%20Certs%20and%20Secrets.kusto

AppRegistrationExpiration_CL
| summarize arg_max(TimeGenerated,*) by KeyId_g // arg_max is used to return the latest record for the time range selected and the * is to return all columns; records with unique KeyId_g will be returned so expired and multiple credentials are returned
| where DaysToExpiration_d <= 1000 or Status_s == "Expired" // this specifies a filter for the number of days before expiry
//| where TimeGenerated > ago(1d) // the TimeGenerated value must be within a day
| project TimeGenerated, Display_Name = DisplayName_s, Application_Client_ID = ApplicationId_Guid_g, Object_ID = ObjectId_g, Cert_or_Secret_ID = KeyId_g, Credential_Type = Type_s, Start_Date = StartDate_value_t, End_Date = EndDate_value_t, Expiration_Status = Status_s, Days_To_Expire = DaysToExpiration_d, Directory_Tenant_ID = TenantId // the columns have been renamed to easier-to-understand headings

Time Range: Last 24 hours
Chart Type: HTML Table

image

Let’s break down the Kusto query line by line:

Look up data in the following log:
AppRegistrationExpiration_CL

Retrieve only the latest records based on the TimeGenerated field, using the unique KeyId_g field (the certificate or secret ID) as the grouping key; this covers App Registrations that have multiple certificates or secrets configured:
| summarize arg_max(TimeGenerated,*) by KeyId_g // arg_max is used to return the latest record for the time range selected and the * is to return all columns, records with unique KeyId_g will be returned so expired and multiple credentials are returned

Only list records where the certificate or secret will be expiring in 50 days or if it has already expired:
| where DaysToExpiration_d <= 50 or Status_s == "Expired" // this specifies a filter for the amount of days before expiry

Only list records with a TimeGenerated value within the last day (this filter is left commented out):
//| where TimeGenerated > ago(1d) // the TimeGenerated value must be within a day

Project the fields of interest for the report and rename them to more meaningful names:
| project TimeGenerated, Display_Name = DisplayName_s, Application_Client_ID = ApplicationId_Guid_g, Object_ID = ObjectId_g, Cert_or_Secret_ID = KeyId_g, Credential_Type = Type_s, Start_Date = StartDate_value_t, End_Date = EndDate_value_t, Expiration_Status = Status_s, Days_To_Expire = DaysToExpiration_d, Directory_Tenant_ID = TenantId // the columns have been renamed to easier-to-understand headings
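The deduplication that arg_max performs can be illustrated outside of Kusto. Here is a hedged Python sketch of the same "keep only the latest record per credential ID" idea; the sample rows are invented for illustration:

```python
# What `summarize arg_max(TimeGenerated, *) by KeyId_g` effectively does:
# for each credential ID, keep only the row with the newest timestamp.
rows = [
    {"KeyId_g": "k1", "TimeGenerated": "2022-03-29T00:00:00", "Status_s": "Valid"},
    {"KeyId_g": "k1", "TimeGenerated": "2022-03-30T00:00:00", "Status_s": "Expired"},
    {"KeyId_g": "k2", "TimeGenerated": "2022-03-30T00:00:00", "Status_s": "Valid"},
]

latest = {}
for row in rows:
    key = row["KeyId_g"]
    # ISO 8601 timestamps compare correctly as plain strings
    if key not in latest or row["TimeGenerated"] > latest[key]["TimeGenerated"]:
        latest[key] = row
```

After this pass, each certificate or secret appears exactly once with its most recent status, which is why App Registrations with multiple credentials are handled correctly.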

Send an email (V2):

The report will include the Attachment Content and Attachment Name derived from the query, with the subject Certificate & Secrets Expiry Report. The email will look pretty barebones, so you are free to add HTML code to pretty it up.

image

Proceed to save the Logic App and use the Run Trigger to test the Logic App and confirm that an email is sent: 

**Note that I have set the Kusto query to return records with an expiry of 1000 days or less so that more records would be returned.

image

Hope this helps anyone looking for a way to set up a Logic App that sends recurring reports of expiring App Registration Certificates and Secrets from custom Log Analytics data.

Attempting to use Chris Scott's App Registration Expiration Monitoring and Notifications displays data with "DaysToExpiration" set to "-738,241"

As described in my previous post:

Using PowerShell to send custom log data to Log Analytics for Azure Monitor alerting and Kusto Query
http://terenceluk.blogspot.com/2022/03/using-powershell-to-send-custom-log.html

I had difficulty getting Christopher Scott’s script to work when used in an Automation Account: the data sent to Log Analytics would display all Certificates & secrets as expired, and the start and end times were not displayed. I ended up spending most of my weekend troubleshooting why, so in an effort to help anyone who may encounter the same issue, this quick blog post outlines the symptoms and the resolution.

Problem

You attempt to use Christopher Scott’s PowerShell script (https://techcommunity.microsoft.com/t5/core-infrastructure-and-security/app-registration-expiration-monitoring-and-notifications/ba-p/2043805) to extract App Registrations’ Certificates & secrets and send them to Log Analytics, but notice that the data returned when querying displays:

  1. All certificates and secrets are shown as Expired
  2. The DaysToExpiration values are all set to -738,241
  3. The StartDate and EndDate fields are not populated
image

Solution

After troubleshooting, I determined that the default Az.Resources module version for the Automation Account was 5.4.0.

image

It appears the cmdlets Get-AzADServicePrincipal and Get-AzADAppCredential return null for many of the fields in this newer version.

To fix this issue, try downgrading the module to 4.2.0 by downloading the nupkg package here:

https://www.powershellgallery.com/packages/Az.Resources/2.5.0

Rename the extension from nupkg to zip.

Import the package in the Modules of the Automation Account:

image

image

image

Run the Runbook again to confirm that the data sent to Log Analytics is now displayed properly:

image

Monday, March 28, 2022

Using PowerShell to send custom log data to Log Analytics for Azure Monitor alerting and Kusto Query

I’ve recently had to look into how the Certificates & secrets configured for App Registrations could be monitored so administrators could be warned well ahead of the expiry, and applications using these Enterprise Applications / service principals would not cease to work unexpectedly. Microsoft Azure unfortunately does not provide a native way to monitor this (yet), but I managed to find a PowerShell script by Christopher Scott (https://techcommunity.microsoft.com/t5/core-infrastructure-and-security/app-registration-expiration-monitoring-and-notifications/ba-p/2043805) that uses the PowerShell cmdlets Get-AzADApplication, Get-AzADServicePrincipal, and Get-AzADAppCredential to extract the information and send it to Log Analytics.

Chris does a great job walking through the steps, but I did not completely understand why and how every component worked. I also could not get his script to work when used in an Automation Account, as the data sent to Log Analytics would display all Certificates & secrets as expired and the start and end times were not displayed. Blindly using a script without understanding it isn’t something I recommend or do myself, so I took the time to review it as well as compare it to the sample script that Microsoft provides here: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#powershell-sample.

One area where Chris’ script differed from Microsoft’s was how the signature and the function that posts to Log Analytics were written, so I decided to use Microsoft’s sample script for building the authorization signature and sending data to Log Analytics, and Chris’ logic for obtaining the App Registration Certificates & secrets.

The purpose of this blog post is to demonstrate how to send custom log data to Log Analytics by breaking down and understanding the components of the finalized working script, which captures the Certificates & secrets configured for App Registrations and uses the HTTP Data Collector API to send log data to Azure Monitor from a PowerShell REST API call. The opportunities for sending data to a Log Analytics Workspace so it can be queried are limitless, and going through this hands-on learning exercise was very exciting for me.

Prerequisites

The components we’ll need for this script are as follows:

The Az.Accounts and Az.Resources modules are required for this script; the versions I have installed are:

Az.Accounts – 2.5.1
Az.Resources – 4.2.0

image

An account with the Global Reader role, which we’ll use to interactively log onto Azure to retrieve the App Registration configuration details. Using a service principal to run this script is obviously the better choice, so I will also include the PowerShell script that uses a Service Principal to log into Azure. It can be found here: https://github.com/terenceluk/Azure/blob/main/PowerShell/Get-AppRegistrationExpirationServicePrincipal.ps1

For the purpose of this post, I’ll continue with logging in interactively with Connect-AzAccount.

Workspace ID: This is the Workspace ID of the Log Analytics that will be storing the data. The value can be found by navigating to the Log Analytics workspace > Agents management > Workspace ID

Shared Key: This is the Primary Key of the Log Analytics that will be storing the data. The value can be found by navigating to the Log Analytics workspace > Agents management > Primary key

The two values above are similar to Storage Account access keys: having them allows us to send log data into the workspace.

image

The full PowerShell script can be found at my GitHub repo: https://github.com/terenceluk/Azure/blob/main/PowerShell/Get-AppRegistrationExpirationInteractive.ps1

As well as pasted at the bottom of this post.

Functions

Two functions are required for this script.

The first function is used to build the signature that will be used as an authorization header when a request is sent to the Azure Monitor HTTP Data Collector API to POST data. This function requires the following parameters to build the signature:

  1. Workspace ID ($customerID) – The Workspace ID of the Log Analytics Workspace
  2. Primary Key ($sharedKey) – The Workspace Primary Key of the Log Analytics Workspace
  3. Date – The current date time
  4. Content Length – The character length of JSON formatted data we are sending to Log Analytics
  5. Method – “POST” is the method that will be sent
  6. Content Type – “application/json” is the content type that will be sent
  7. Resource – “/api/logs” is the resource that will be sent

Once the above parameters are collected, the function builds the signature used in the authorization header and returns it to the function that posts data to the Log Analytics Workspace.

# The following function builds the signature used in the authorization header for requests sent to the Azure Monitor HTTP Data Collector API
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
    return $authorization
}
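To make the signature construction easier to verify outside of PowerShell, here is a minimal Python sketch of the same HMAC-SHA256 scheme the Data Collector API expects. The function name and the throwaway key below are my own, not part of the original script:

```python
import base64
import hashlib
import hmac

def build_signature(customer_id, shared_key, date, content_length,
                    method="POST", content_type="application/json",
                    resource="/api/logs"):
    """Build the SharedKey authorization value: an HMAC-SHA256 over the
    canonical string, keyed with the base64-decoded workspace primary key."""
    string_to_hash = "\n".join([method, str(content_length), content_type,
                                "x-ms-date:" + date, resource])
    key_bytes = base64.b64decode(shared_key)
    digest = hmac.new(key_bytes, string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    return "SharedKey {0}:{1}".format(customer_id,
                                      base64.b64encode(digest).decode())

# Example with a throwaway key (not a real workspace key):
sig = build_signature("my-workspace-id",
                      base64.b64encode(b"0" * 32).decode(),
                      "Wed, 30 Mar 2022 08:00:00 GMT", 128)
```

The canonical string joins the method, content length, content type, x-ms-date header, and resource with newlines, exactly mirroring the `$stringToHash` built in the PowerShell function above.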

The next function is what actually sends data to the Log Analytics workspace. This function receives the following parameters:

  1. Workspace ID ($customerID) – The Workspace ID of the Log Analytics Workspace
  2. Primary Key ($sharedKey) – The Workspace Primary Key of the Log Analytics Workspace
  3. Body – The data to be sent to the Log Analytics workspace in JSON format
  4. Log Type – The name of the log the data should be sent to

Once the above parameters are collected, the function passes the required parameters to the signature-building function above, then inserts the custom Log Analytics Workspace ID to build the URI / API endpoint (https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#request-uri):

https://<CustomerId>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01

It will then create the required header containing the authorization signature, the custom log name in the Log Analytics workspace, the current date and time, and an optional timestamp field that can be an empty string, in which case Azure Monitor assumes the time is the message ingestion time.

Finally, the cmdlet Invoke-WebRequest (https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/invoke-webrequest?view=powershell-7.2) is used to send the HTTPS request to the URI / API endpoint.

# The following function creates and posts the request using the signature created by the Build-Signature function for authorization
Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}

Here is an example of what the $headers variable would contain:

image
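For comparison, the URI and header assembly can also be sketched in Python. The helper name and sample values below are hypothetical, and the actual POST (done by Invoke-WebRequest in the script) is intentionally omitted:

```python
from datetime import datetime, timezone

def build_request(customer_id, signature, log_type, timestamp_field=""):
    """Assemble the Data Collector API endpoint and the headers the
    script sends; the network call itself is left out of this sketch."""
    uri = ("https://" + customer_id + ".ods.opinsights.azure.com"
           "/api/logs?api-version=2016-04-01")
    # RFC 1123 date, the same format [DateTime]::UtcNow.ToString("r") produces
    rfc1123_date = datetime.now(timezone.utc).strftime(
        "%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Authorization": signature,
        "Log-Type": log_type,
        "x-ms-date": rfc1123_date,
        "time-generated-field": timestamp_field,
    }
    return uri, headers

uri, headers = build_request("my-workspace-id",
                             "SharedKey my-workspace-id:abc=",
                             "AppRegistrationExpiration")
```

Note that an empty time-generated-field tells Azure Monitor to use the ingestion time, matching the script's empty $TimeStampField.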

Connecting to Azure

The cmdlet Connect-AzAccount is used to interactively connect and authenticate the PowerShell session.

# Log in interactively with an account with Global Reader role

Connect-AzAccount

Defining Variables

As mentioned in the prerequisites, we’ll need the following variables assigned with their values.

  1. Workspace ID ($customerID) – The Workspace ID of the Log Analytics Workspace
  2. Primary Key ($sharedKey) – The Workspace Primary Key of the Log Analytics Workspace
  3. Log Type – The name of the log the data should be sent to
  4. Time Stamp Field – This variable is optional and we’ll be leaving it empty for this script

# Replace with your Workspace ID
$customerId = "b0d472a3-8c13-4cec-8abb-76051843545f"

# Replace with your Workspace Primary Key
$sharedKey = "D3s71+X0M+Q3cGTHC5I6H6l23xRNAKvjA+yb8JzMQQd3ntxeFZLmMWIMm7Ih/LPMOji9zkXDwavAJLX1xEe/4g=="

# Specify the name of the record type that you'll be creating (this is what will be displayed under Log Analytics > Logs > Custom Logs)
$LogType = "AppRegistrationExpiration"

# Optional name of a field that includes the timestamp for the data. If the time field is not specified, Azure Monitor assumes the time is the message ingestion time
$TimeStampField = ""

Obtaining list of App Registrations and Enterprise Applications / Service Principals

The list of App Registrations and Enterprise Applications / Service Principals for the Azure AD tenant will be retrieved and stored in variables.

# Get the full list of Azure AD App Registrations

$applications = Get-AzADApplication

Here is a sample of what the variable would contain:

image

# Get the full list of Azure AD Enterprise Applications (Service Principals)

$servicePrincipals = Get-AzADServicePrincipal

Here is a sample of what the variable would contain:

image

Filter for App Registrations that have Certificates & secrets configured

Next, an array will be created to store applications that have Certificates & secrets configured. Then the array will be populated with the following fields:

  1. DisplayName
  2. ObjectId
  3. ApplicationId
  4. KeyId
  5. Type
  6. StartDate
  7. EndDate

# Create an array named appWithCredentials
$appWithCredentials = @()

# Populate the array with App Registrations that have credentials
# Retrieve the list of applications and sort them by DisplayName
$appWithCredentials += $applications | Sort-Object -Property DisplayName | % {
    # Assign the variable $application with the current pipeline object
    $application = $_
    # Retrieve the list of Enterprise Applications (Service Principals) and match the ApplicationId of the SP to the App Registration
    $sp = $servicePrincipals | ? ApplicationId -eq $application.ApplicationId
    Write-Verbose ('Fetching information for application {0}' -f $application.DisplayName)
    # Use the Get-AzADAppCredential cmdlet to get the Certificates & secrets configured (this returns StartDate, EndDate, KeyId, Type, Usage, CustomKeyIdentifier)
    # Populate the array with the DisplayName, ObjectId, ApplicationId, KeyId, Type, StartDate and EndDate of each certificate or secret for each App Registration
    $application | Get-AzADAppCredential -ErrorAction SilentlyContinue | Select-Object `
        -Property @{Name='DisplayName'; Expression={$application.DisplayName}}, `
        @{Name='ObjectId'; Expression={$application.ObjectId}}, `
        @{Name='ApplicationId'; Expression={$application.ApplicationId}}, `
        @{Name='KeyId'; Expression={$_.KeyId}}, `
        @{Name='Type'; Expression={$_.Type}}, `
        @{Name='StartDate'; Expression={$_.StartDate -as [datetime]}}, `
        @{Name='EndDate'; Expression={$_.EndDate -as [datetime]}}
}

Here is a sample of what the array would contain:

image
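The same gather-and-project step can be sketched in Python to show the shape of the resulting records. All sample applications, credentials, and field values below are invented for illustration:

```python
# Each App Registration is joined with its credentials and projected down
# to the fields the script keeps.
applications = [
    {"DisplayName": "App A", "ObjectId": "obj-1", "ApplicationId": "app-1"},
    {"DisplayName": "App B", "ObjectId": "obj-2", "ApplicationId": "app-2"},
]
# Stand-in for what Get-AzADAppCredential would return per application:
credentials = {
    "app-1": [
        {"KeyId": "key-1", "Type": "AsymmetricX509Cert",
         "StartDate": "2021-05-29T18:26:46", "EndDate": "2022-05-29T18:46:46"},
        {"KeyId": "key-2", "Type": "Password",
         "StartDate": "2021-06-01T00:00:00", "EndDate": "2022-06-01T00:00:00"},
    ],
}
app_with_credentials = [
    {"DisplayName": app["DisplayName"], "ObjectId": app["ObjectId"],
     "ApplicationId": app["ApplicationId"], **cred}
    for app in sorted(applications, key=lambda a: a["DisplayName"])
    for cred in credentials.get(app["ApplicationId"], [])
]
```

An app with two credentials yields two flat records and an app with none yields no record, which is the behavior the Kusto query later relies on.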

Adding additional fields to specify whether certificate & secret has expired

It is possible to immediately send the information already collected to the Log Analytics workspace, but Chris Scott took it one step further and appended additional fields: whether the certificate or secret is expired, the timestamp used to check the validity, and the days until expiry. This could also be accomplished in the Kusto query, but I find it very handy to add here.

# With the $appWithCredentials array populated with each App Registration's Certificates & secrets, proceed to calculate and add the following fields to each record:
# - Expiration status of the certificate or secret (Valid or Expired)
# - The timestamp used to calculate the validity
# - The days until the certificate or secret expires
Write-Output 'Validating expiration data...'
$timeStamp = Get-Date -Format o
$today = (Get-Date).ToUniversalTime()
$appWithCredentials | Sort-Object EndDate | % {
    # The if branch catches certificates & secrets that are expired
    if($_.EndDate -lt $today) {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Expired'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timeStamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    # The else branch catches certificates & secrets that are still valid
    } else {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Valid'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timeStamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    }
}

Here is a sample of what the array will contain:

image
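The expiry check itself is simple date arithmetic. Here is a hedged Python sketch of the same Valid/Expired classification; the helper name and the dates are mine, chosen only to exercise both branches:

```python
from datetime import datetime, timezone

def classify(end_date, today):
    """Mirror the script's check: days remaining and a Valid/Expired status.
    Note that DaysToExpiration is negative for credentials already expired."""
    days = (end_date - today).days
    status = "Expired" if end_date < today else "Valid"
    return {"DaysToExpiration": days, "Status": status}

today = datetime(2022, 3, 30, tzinfo=timezone.utc)
expired = classify(datetime(2022, 1, 1, tzinfo=timezone.utc), today)
valid = classify(datetime(2022, 6, 1, tzinfo=timezone.utc), today)
```

This also shows why a healthy record carries a positive DaysToExpiration: the Kusto filter `DaysToExpiration_d <= 50` can then pick up credentials approaching expiry.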

Converting data to be sent to Log Analytics to JSON

The HTTP Data Collector API expects the data to be in JSON format so the collected information is converted:

# Convert the list of each certificate and secret for each App Registration into JSON format so we can send it to Log Analytics
$appWithCredentialsJSON = $appWithCredentials | ConvertTo-Json

## The following commented lines are a sample JSON that can be used to test sending data to Log Analytics
<#
$json = @"
[{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "2ea30e24-e2ad-44ff-865a-df07199f26a5",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T18:26:46",
    "EndDate": "2022-05-29T18:46:46"
},
{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "259dbc4d-cdde-4007-a9ed-887437560b15",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T17:46:22",
    "EndDate": "2022-05-29T18:06:22"
}]
"@
#>

Use the Post-LogAnalyticsData function to send the collected data to the Log Analytics Workspace

With the data collected, proceed to use the Post-LogAnalyticsData function to send the data:

# Submit the data to the API endpoint

Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($appWithCredentialsJSON)) -logType $logType

Here is a successful POST with a return code of 200:

image

More information about the return codes can be found here: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#return-codes

------------------------------------------------------------------------------------------------------------

Note that it will take a bit of time before the data is displayed in the Log Analytics Workspace. I’ve found that I sometimes have to wait upwards of 15 minutes or more before it is displayed.

When it is displayed, you should see a table with the log name you specified under Custom Logs:

image

You will also see the table listed under Custom Logs:

image

Querying the log will display the following:

image

The fields available as you scroll across are:

  1. TimeGenerated [UTC]
  2. Computer
  3. RawData
  4. DisplayName_s
  5. ObjectId_g
  6. ApplicationId_value_g
  7. ApplicationId_Guid_g
  8. KeyId_g
  9. Type_s
  10. StartDate_t [UTC]
  11. EndDate_t [UTC]
  12. Status_s
  13. TimeStamp_t [UTC]
  14. DaysToExpiration_d
  15. Type
  16. _ResourceId
  17. TenantId
  18. SourceSystem
  19. MG
  20. ManagementGroupName

You might be wondering the following:

Question: Why are there _s, _g, _d appended to some of the variables?

Answer: These suffixes indicate the record type (string, Boolean, double, date/time, GUID) and are appended automatically.

image

See the following link for more information: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api#record-type-and-properties
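As a rough illustration only (the real type inference happens server-side in Azure Monitor, and this hypothetical helper merely approximates it), the suffixing rules can be sketched as:

```python
import re

# GUIDs get _g, ISO date/times get _t, numbers get _d, booleans get _b,
# and everything else falls back to a string (_s).
GUID_RE = re.compile(
    r"^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
    r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$")
ISO_RE = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}")

def suffix_for(value):
    """Approximate the suffix Azure Monitor appends to a custom-log field
    based on the inferred type of its JSON value."""
    if isinstance(value, bool):  # bool must be checked before int/float
        return "_b"
    if isinstance(value, (int, float)):
        return "_d"
    if isinstance(value, str) and GUID_RE.match(value):
        return "_g"
    if isinstance(value, str) and ISO_RE.match(value):
        return "_t"
    return "_s"
```

This is why KeyId becomes KeyId_g, DisplayName becomes DisplayName_s, and DaysToExpiration becomes DaysToExpiration_d in the queries above.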

Question: Why are there extra fields?

Answer: Some of them are reserved properties (e.g. tenant, TimeGenerated, RawData) and the others are default fields that are added automatically.

------------------------------------------------------------------------------------------------------------

I hope this post provides a bit more information about how to send custom log data to a Log Analytics Workspace. The Microsoft documentation:

Send log data to Azure Monitor by using the HTTP Data Collector API (preview)
https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api

… does a fantastic job of explaining all the components in detail, albeit being a bit of a long read, so I hope this blog post provides a shortened version of the mechanics.

I will be following up with another post that demonstrates how to automate the use of this script to collect App Registrations’ Certificates and Secrets expiration into Log Analytics, then use a Logic App to create and send a report out via email so stay tuned.

------------------------------------------------------------------------------------------------------------

# The following function builds the signature used in the authorization header for requests sent to the Azure Monitor HTTP Data Collector API
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
    return $authorization
}

# The following function creates and posts the request using the signature created by the Build-Signature function for authorization
Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}
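To make the request assembly explicit, here is a small Python sketch of the URI and headers the function above builds. The endpoint format and header names come from the script itself; the workspace ID is a dummy value and the Authorization value is a placeholder for the output of the signature function:

```python
# Illustrative sketch of the request assembly performed by Post-LogAnalyticsData
customer_id = "00000000-0000-0000-0000-000000000000"  # dummy workspace ID
resource = "/api/logs"
rfc1123_date = "Mon, 04 Apr 2022 12:00:00 GMT"

# Data Collector API endpoint: the workspace ID forms the hostname
uri = (f"https://{customer_id}.ods.opinsights.azure.com"
       f"{resource}?api-version=2016-04-01")

headers = {
    "Authorization": "SharedKey ...",          # value from the signature function
    "Log-Type": "AppRegistrationExpiration",   # custom table name (Azure appends _CL)
    "x-ms-date": rfc1123_date,                 # must match the date used in the signature
    "time-generated-field": "",                # optional timestamp field name
}
print(uri)
```

Note that the same RFC 1123 date string must be used both in the signature and in the x-ms-date header, otherwise the API rejects the request.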

# Log in interactively with an account that has the Global Reader role
Connect-AzAccount

# Replace with your Workspace ID
$customerId = "b0d472a3-8c13-4cec-8abb-76051843545f"

# Replace with your Workspace Primary Key
$sharedKey = "D3s71+X0M+Q3cGTHC5I6H6l23xRNAKvjA+yb8JzMQQd3ntxeFZLmMWIMm7Ih/LPMOji9zkXDwavAJLX1xEe/4g=="

# Specify the name of the record type that you'll be creating (this is what will be displayed under Log Analytics > Logs > Custom Logs)
$LogType = "AppRegistrationExpiration"

# Optional name of a field that includes the timestamp for the data. If the time field is not specified, Azure Monitor assumes the time is the message ingestion time
$TimeStampField = ""

# Get the full list of Azure AD App Registrations
$applications = Get-AzADApplication

# Get the full list of Azure AD Enterprise Applications (Service Principals)
$servicePrincipals = Get-AzADServicePrincipal

# Create an array named appWithCredentials
$appWithCredentials = @()

# Populate the array with app registrations that have credentials
# Retrieve the list of applications and sort them by DisplayName
$appWithCredentials += $applications | Sort-Object -Property DisplayName | % {
    # Assign the current pipeline object to the $application variable
    $application = $_
    # Match the ApplicationId of the Enterprise Application (Service Principal) to the App Registration
    $sp = $servicePrincipals | ? ApplicationId -eq $application.ApplicationId
    Write-Verbose ('Fetching information for application {0}' -f $application.DisplayName)
    # Use the Get-AzADAppCredential cmdlet to get the Certificates & secrets configured (this returns StartDate, EndDate, KeyID, Type, Usage, CustomKeyIdentifier)
    # Populate the array with the DisplayName, ObjectId, ApplicationId, KeyId, Type, StartDate and EndDate of each certificate or secret for each App Registration
    $application | Get-AzADAppCredential -ErrorAction SilentlyContinue | Select-Object `
        -Property @{Name='DisplayName'; Expression={$application.DisplayName}}, `
        @{Name='ObjectId'; Expression={$application.ObjectId}}, `
        @{Name='ApplicationId'; Expression={$application.ApplicationId}}, `
        @{Name='KeyId'; Expression={$_.KeyId}}, `
        @{Name='Type'; Expression={$_.Type}}, `
        @{Name='StartDate'; Expression={$_.StartDate -as [datetime]}}, `
        @{Name='EndDate'; Expression={$_.EndDate -as [datetime]}}
}
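The key point of the pipeline above is that it emits one record per credential, not one per application, which is why App Registrations with multiple certificates or secrets are all captured. A minimal Python sketch of that flattening, with made-up sample data for illustration:

```python
# Hypothetical sample: one app with two credentials (data is illustrative only)
applications = [
    {"DisplayName": "App A", "ObjectId": "obj-1", "ApplicationId": "app-1",
     "Credentials": [
         {"KeyId": "key-1", "Type": "AsymmetricX509Cert",
          "StartDate": "2021-05-29T18:26:46", "EndDate": "2022-05-29T18:46:46"},
         {"KeyId": "key-2", "Type": "Password",
          "StartDate": "2021-06-01T00:00:00", "EndDate": "2023-06-01T00:00:00"},
     ]},
]

# Flatten: one output record per credential, stamped with the parent app's identifiers
app_with_credentials = [
    {"DisplayName": app["DisplayName"], "ObjectId": app["ObjectId"],
     "ApplicationId": app["ApplicationId"], **cred}
    for app in sorted(applications, key=lambda a: a["DisplayName"])
    for cred in app["Credentials"]
]
print(len(app_with_credentials))  # 2 records for 1 app
```

Each flattened record carries the app's DisplayName, ObjectId, and ApplicationId alongside the individual credential's KeyId, Type, StartDate, and EndDate.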

# With the $appWithCredentials array populated with the Certificates & secrets and their App Registrations, proceed to calculate and add the following fields to each record in the array:
# - Status: whether the certificate or secret is Valid or Expired
# - TimeStamp: the timestamp used to calculate the validity
# - DaysToExpiration: the days until the certificate or secret expires
Write-Output 'Validating expiration data...'
$timeStamp = Get-Date -Format o
$today = (Get-Date).ToUniversalTime()
$appWithCredentials | Sort-Object EndDate | % {
    # The if branch catches certificates & secrets that have already expired
    if ($_.EndDate -lt $today) {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Expired'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timestamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    # The else branch catches certificates & secrets that are still valid
    } else {
        $days = ($_.EndDate - $today).Days
        $_ | Add-Member -MemberType NoteProperty -Name 'Status' -Value 'Valid'
        $_ | Add-Member -MemberType NoteProperty -Name 'TimeStamp' -Value "$timestamp"
        $_ | Add-Member -MemberType NoteProperty -Name 'DaysToExpiration' -Value $days
    }
}
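The status and days-to-expiration logic is simple enough to verify by hand: credentials past their EndDate get a negative DaysToExpiration and a Status of Expired. A small Python sketch of the same calculation (dates are illustrative):

```python
from datetime import datetime, timezone

def classify(end_date, today):
    """Mirror the loop above: EndDate before today => Expired,
    and DaysToExpiration goes negative for expired credentials."""
    days = (end_date - today).days
    status = "Expired" if end_date < today else "Valid"
    return {"Status": status, "DaysToExpiration": days}

today = datetime(2022, 3, 30, tzinfo=timezone.utc)
expired = classify(datetime(2022, 3, 1, tzinfo=timezone.utc), today)   # 29 days past
valid = classify(datetime(2022, 5, 29, tzinfo=timezone.utc), today)    # 60 days left
print(expired, valid)
```

This is also why the "-738,241" value mentioned in my earlier post was an immediate red flag: a DaysToExpiration that large means the EndDate being subtracted was effectively a zero date, not a real expiry.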

# Convert the list of certificates & secrets for each App Registration into JSON format so we can send it to Log Analytics
$appWithCredentialsJSON = $appWithCredentials | ConvertTo-Json

## The following commented lines are a sample JSON payload that can be used to test sending data to Log Analytics
<#
$json = @"
[{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "2ea30e24-e2ad-44ff-865a-df07199f26a5",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T18:26:46",
    "EndDate": "2022-05-29T18:46:46"
},
{
    "DisplayName": "Vulcan O365 Audit Logs",
    "ObjectId": "058f1297-ba80-4b9e-8f9c-15febdf85df0",
    "ApplicationId": {
        "value": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57",
        "Guid": "ac28a30a-6e5f-4c2d-9384-17bbb0809d57"
    },
    "KeyId": "259dbc4d-cdde-4007-a9ed-887437560b15",
    "Type": "AsymmetricX509Cert",
    "StartDate": "2021-05-29T17:46:22",
    "EndDate": "2022-05-29T18:06:22"
}]
"@
#>

# Submit the data to the API endpoint
Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($appWithCredentialsJSON)) -logType $LogType

Monday, March 21, 2022

Configuring Azure Sentinel to capture and monitor Azure AD logs

As a follow up to my previous post:

Monitoring, Alerting, Reporting Azure AD logins and login failures with Log Analytics and Logic Apps
http://terenceluk.blogspot.com/2022/02/monitoring-alerting-reporting-azure-ad.html

If your organization already uses Azure Sentinel as a SIEM (Security Information and Event Management) solution, it is preferable to use the Azure Active Directory Data Connector available in Sentinel to capture and monitor Azure AD logs. This post demonstrates how to achieve the same results by leveraging Azure Sentinel to capture and query the Azure AD events.

Create Log Analytics Workspace

Begin by creating a Log Analytics Workspace that will store the Azure AD logs streamed by Azure Sentinel:


Add Microsoft Sentinel to the new Log Analytics Workspace

With the Log Analytics Workspace created, proceed to add Microsoft Sentinel to the workspace:


Configure the Azure Sentinel Data Connector to collect Azure Active Directory logs

Navigate to Data Connectors, type in Azure Active Directory in the filter text field, select Azure Active Directory then click on Open connector page:


The types of Azure AD logs that Microsoft Sentinel can capture are listed in the connector page. Note that in order to export Sign-in data, your organization needs an Azure AD P1 or P2 license.


Detailed information about the options can be found here:

Connect Azure Active Directory (Azure AD) data to Microsoft Sentinel
https://docs.microsoft.com/en-us/azure/sentinel/connect-azure-active-directory

You can use Microsoft Sentinel's built-in connector to collect data from Azure Active Directory and stream it into Microsoft Sentinel. The connector allows you to stream the following log types:

  • Sign-in logs, which contain information about interactive user sign-ins where a user provides an authentication factor. The Azure AD connector now also includes the following three additional categories of sign-in logs, all currently in PREVIEW:
    • Non-interactive user sign-in logs
    • Service principal sign-in logs
    • Managed Identity sign-in logs
  • Audit logs, which contain information about system activity relating to user and group management, managed applications, and directory activities.
  • Provisioning logs (also in PREVIEW), which contain system activity information about users, groups, and roles provisioned by the Azure AD provisioning service.

The options:

  • ADFS Sign-In Logs (Preview)
  • User Risk Events (Preview)
  • Risk Users (Preview)

… are new sources that Microsoft has released after the initial options.

ADFS Sign-In Logs requires Azure AD Connect Health, which correlates Event IDs from AD FS to provide information about the request, along with error details if the request fails. More information about ADFS Sign-In Logs can be found here:

AD FS sign-ins in Azure AD with Connect Health – preview
https://docs.microsoft.com/en-us/azure/active-directory/hybrid/how-to-connect-health-ad-fs-sign-in

What is Azure AD Connect Health?
https://docs.microsoft.com/en-us/azure/active-directory/hybrid/whatis-azure-ad-connect#what-is-azure-ad-connect-health

Download Azure AD Connect Health Agent for AD FS
https://portal.azure.com/#blade/Microsoft_Azure_ADHybridHealth/AadHealthMenuBlade/QuickStart

User Risk Events (Preview) and Risk Users (Preview) pertain to Azure AD Identity Protection, which is made available with Azure AD Premium P2 licenses. More information about the details of these events can be found here: https://docs.microsoft.com/en-us/azure/active-directory/identity-protection/concept-identity-protection-risks. More information about the type of risk data that can be queried can be found here: https://docs.microsoft.com/en-gb/azure/active-directory/identity-protection/howto-export-risk-data


For the purpose of this example, we’ll select:

  • Sign-in logs
  • Audit logs
  • Non-interactive user sign-in logs
  • Service principal sign-in logs
  • Managed Identity sign-in logs
  • Provisioning logs

Note that you may need to wait upwards of 30 minutes before the connector’s status switches from Not connected to Connected and the Data types begin to be highlighted in green:


Once connected, you should see the following tables under Log Management:

  • SigninLogs
  • AuditLogs
  • AADNonInteractiveUserSignInLogs
  • AADServicePrincipalSignInLogs
  • AADManagedIdentitySignInLogs
  • AADProvisioningLogs

From here, the sky is really the limit, as we have access to the various logs and are able to query for any type of information we want with Kusto. One of the examples I demonstrated in my previous post is the following query, which looks for failed sign-ins within the SigninLogs table:

SigninLogs
| where Status.errorCode != 0
| extend City=LocationDetails.city, State=LocationDetails.state, Country=LocationDetails.countryOrRegion, Error_Code=Status.errorCode, Failure_Reason=Status.failureReason
| project TimeGenerated, UserDisplayName, AppDisplayName, IPAddress, City, State, Country, AuthenticationRequirement, Failure_Reason, ConditionalAccessStatus, ConditionalAccessPolicies, Error_Code


With Azure Sentinel configured to capture the Azure AD logs, we can configure Logic Apps to send out daily reports or perform other automation tasks for alerting. Please see my previous post for a configuration demonstration:

Monitoring, Alerting, Reporting Azure AD logins and login failures with Log Analytics and Logic Apps
http://terenceluk.blogspot.com/2022/02/monitoring-alerting-reporting-azure-ad.html