
Tuesday, July 25, 2023

Creating Azure Route Tables, UDRs, and IPGroups with PowerShell and Excel reference files

I recently worked with a colleague to complete a deployment, and two of the more laborious activities we had to complete were:

  1. Creating Route Tables with UDRs (user-defined routes)
  2. Creating IP Groups

There were a significant number of entries for both resources, and while it was possible to create these manually in the portal, I felt it was better to create a PowerShell script to accelerate the creation and minimize typos and copy-and-paste errors. The two scripts I created for this are as follows.

Creating Route Tables and UDRs

The PowerShell script I created, which can be found in my GitHub repo at https://github.com/terenceluk/Azure/blob/main/PowerShell/Create-Route-Tables-and-UDRs.ps1, reads an Excel file and creates the route tables and the corresponding UDRs (all route tables should have the same UDRs). One of the conditions I’ve added is an IF statement that checks whether the UDR to be added covers the same subnet the route table will be attached to. If it does, the script skips creating that UDR so we don’t end up routing traffic from the same subnet up to the firewall. The naming convention I designed allows me to compare the Route Table and UDR names to determine whether they match, but if your environment is different you’ll need to adjust the check. Here are screenshots of the sample spreadsheet that is read, followed by a minimal sketch of the approach:

imageimage
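For readers who just want the gist, below is a minimal sketch of the approach rather than the full GitHub script. It assumes the ImportExcel and Az.Network modules are installed, and the worksheet and column names (RouteTableName, ResourceGroup, Location, RouteName, AddressPrefix, NextHopIp) are hypothetical placeholders that should be adjusted to match your own spreadsheet:

# Minimal sketch - assumes the ImportExcel and Az.Network modules are installed and
# that the worksheet/column names below (hypothetical) match your spreadsheet
$routeTables = Import-Excel -Path ".\RouteTables.xlsx" -WorksheetName "RouteTables"
$udrs        = Import-Excel -Path ".\RouteTables.xlsx" -WorksheetName "UDRs"

foreach ($rt in $routeTables) {
    # Create the route table (overwrite if it already exists)
    $routeTable = New-AzRouteTable -Name $rt.RouteTableName `
        -ResourceGroupName $rt.ResourceGroup `
        -Location $rt.Location -Force

    foreach ($udr in $udrs) {
        # Skip the UDR when it targets the same subnet this route table will be attached to,
        # so traffic within the subnet isn't hairpinned up to the firewall
        # (simplified name comparison - adjust to your own naming convention)
        if ($rt.RouteTableName -like "*$($udr.RouteName)*") { continue }

        Add-AzRouteConfig -RouteTable $routeTable `
            -Name $udr.RouteName `
            -AddressPrefix $udr.AddressPrefix `
            -NextHopType VirtualAppliance `
            -NextHopIpAddress $udr.NextHopIp | Out-Null
    }

    # Commit the accumulated routes to Azure
    Set-AzRouteTable -RouteTable $routeTable | Out-Null
}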

Creating IP Groups

There were many IP Groups that needed to be created as well because the environment had an IP Group for each subnet. The script that reads an Excel file and creates the list of IP Groups can be found in my GitHub repo: https://github.com/terenceluk/Azure/blob/main/PowerShell/Create-IP-Groups.ps1

Here are sample screenshots of the Excel file:

image
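As a rough illustration of what that script does, a minimal sketch could look like the following; the column names (IpGroupName, ResourceGroup, Location, IpAddresses) are hypothetical and the authoritative version is the GitHub script above:

# Minimal sketch - assumes ImportExcel and Az.Network, with hypothetical column names
$ipGroups = Import-Excel -Path ".\IPGroups.xlsx"

foreach ($group in $ipGroups) {
    # The IpAddresses column is assumed to hold a comma-separated list of CIDR ranges
    $addresses = $group.IpAddresses -split "," | ForEach-Object { $_.Trim() }

    New-AzIpGroup -Name $group.IpGroupName `
        -ResourceGroupName $group.ResourceGroup `
        -Location $group.Location `
        -IpAddress $addresses
}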

Tuesday, July 11, 2023

Sysprep fails due to Notepad++ when preparing virtual machine for image capture to deploy Azure Virtual Desktop

One of the common issues I’ve continuously come across while preparing Windows 10 and Windows Server 2016 and later operating systems for virtual desktops or remote desktop services is when sysprep fails due to an installed application linked to a user account. I encountered this issue again last month with the application Notepad++ when preparing a Windows 11 Enterprise Multi-Session virtual machine for an Azure Virtual Desktop deployment. There are plenty of different PowerShell cmdlets that can be run in an attempt to fix the issue, but I find some of them leave the virtual machine in a state that I would no longer be confident deploying, so I wanted to document the steps I use for personal reference and to help anyone who may encounter a similar issue.

Problem

You attempt to run sysprep after finishing the preparation of a master image:

image

Sysprep immediately fails with the error:

Sysprep was not able to validate your Windows installation.
Review the log file at
%WINDIR%\System32\Sysprep\Panther\setupact.log for
details. After resolving the issue, use Sysprep to validate your installation again.

image

Opening the setupact.log will reveal the following line:

Error                 SYSPRP Package NotepadPlusPlus_1.0.0.0_neutral__7njy0v32s6xk6 was installed for a user, but not provisioned for all users. This package will not function properly in the sysprep image.

image

Opening the setuperr.log will reveal the following lines:

2023-05-05 07:20:10, Error                 SYSPRP BCD: BiUpdateEfiEntry failed c000000d

2023-05-05 07:20:10, Error                 SYSPRP BCD: BiExportBcdObjects failed c000000d

2023-05-05 07:20:10, Error                 SYSPRP BCD: BiExportStoreAlterationsToEfi failed c000000d

2023-05-05 07:20:10, Error                 SYSPRP BCD: Failed to export alterations to firmware. Status: c000000d

2023-07-07 19:51:46, Error                 SYSPRP Package NotepadPlusPlus_1.0.0.0_neutral__7njy0v32s6xk6 was installed for a user, but not provisioned for all users. This package will not function properly in the sysprep image.

2023-07-07 19:51:46, Error                 SYSPRP Failed to remove apps for the current user: 0x80073cf2.

2023-07-07 19:51:46, Error                 SYSPRP Exit code of RemoveAllApps thread was 0x3cf2.

2023-07-07 19:51:46, Error                 SYSPRP ActionPlatform::LaunchModule: Failure occurred while executing 'SysprepGeneralizeValidate' from C:\Windows\System32\AppxSysprep.dll; dwRet = 0x3cf2

2023-07-07 19:51:46, Error                 SYSPRP SysprepSession::Validate: Error in validating actions from C:\Windows\System32\Sysprep\ActionFiles\Generalize.xml; dwRet = 0x3cf2

2023-07-07 19:51:46, Error                 SYSPRP RunPlatformActions:Failed while validating Sysprep session actions; dwRet = 0x3cf2

2023-07-07 19:51:46, Error      [0x0f0070] SYSPRP RunDlls:An error occurred while running registry sysprep DLLs, halting sysprep execution. dwRet = 0x3cf2

2023-07-07 19:51:46, Error      [0x0f00d8] SYSPRP WinMain:Hit failure while pre-validate sysprep generalize internal providers; hr = 0x80073cf2

image

Proceeding to uninstall Notepad++ from the image will allow sysprep to run and complete successfully but this means the deployed virtual desktops would need the application installed manually.

Solution

The first step to take for resolving this issue is to restore the virtual machine from a snapshot that has not failed a sysprep, because the sysprep process removes packages from the operating system and there will be times when:

  1. After fixing the Notepad++ application, sysprep would fail and error out on other native Microsoft applications
  2. You would notice that Notepad is no longer available on the virtual machine
  3. Other odd errors would occur

It is better to troubleshoot and perform sysprep on a machine that has not experienced a half-executed but failed sysprep.

Once a fresh snapshot is restored, we can work on determining which accounts Notepad++ is linked to. This can be reviewed by starting PowerShell and executing the following cmdlet:

Get-AppxPackage -AllUsers | Format-List -Property PackageFullName,PackageUserInformation

The cmdlet above will list all packages installed and this example coincidentally places the Notepad++ package at the end of the output:

image

If the package in question starts with a letter earlier than M (for Microsoft) and ends up buried within the long output, we can use the following cmdlet to filter the PackageFullName to what is being searched for:

Get-AppxPackage -AllUsers | Where-Object {$_.PackageFullName -like "NotepadPlusPlus*"} | Format-List -Property PackageFullName,PackageUserInformation

With the package located, identify which accounts are listed as having the application installed. The screenshot above only lists one account, but if there are more, the easiest approach is to delete all of the accounts and their profiles. If there is only one account listed and it is the built-in administrator account, you won’t be able to delete it because the following error will be displayed when you try to do so:

The following error occurred while attempting to delete the user admin:

Cannot perform this operation on built-in accounts.

image

To get around this, log in as the account the Notepad++ install is linked to, launch PowerShell, and execute the following cmdlet:

Remove-AppxPackage -Package <packagefullname>

The following is the cmdlet that is used to remove Notepad++ from the account:

Remove-AppxPackage -Package NotepadPlusPlus_1.0.0.0_neutral__7njy0v32s6xk6

image
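If there happen to be several user-linked packages to clean up, the lookup and removal can be combined into a short loop similar to the sketch below, run while logged in as the affected account:

# Find every user-linked Notepad++ Appx registration and remove it for the signed-in user
Get-AppxPackage -AllUsers |
    Where-Object { $_.PackageFullName -like "NotepadPlusPlus*" } |
    ForEach-Object {
        Write-Host "Removing $($_.PackageFullName)"
        Remove-AppxPackage -Package $_.PackageFullName
    }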

You should no longer find the Notepad++ package when executing the following cmdlet:

Get-AppxPackage -AllUsers | Where-Object {$_.PackageFullName -like "NotepadPlusPlus*"} | Format-List -Property PackageFullName,PackageUserInformation

image

Running sysprep should now complete so the virtual machine can be captured to an image for session host deployments.

Thursday, July 6, 2023

Converting Azure Firewall logs in JSON format created from Archive to a storage account diagnostic setting to CSV format

One of the clients I recently worked with had a requirement that all traffic traversing the Azure Firewall needed to be stored for at least 6 months due to auditing requirements. Accomplishing this wasn’t difficult because it was a matter of either increasing the retention for the Log Analytics workspace or sending the log files to a storage account for archiving. Given the long period of 6 months, I opted to set the Log Analytics workspace retention to 3 months and provide the remaining retention by sending the logs to a storage account:

image image

The Firewall logs that are sent to the storage account will be stored in a container named insights-logs-azurefirewall:

image

Navigating into this container will show a folder tree consisting of multiple subfolders identifying the subscription and the name of the resource group containing the firewall, which also contains the VNet because the firewall resource is required to be stored in the same RG as the VNet:

insights-logs-azurefirewall / resourceId= / SUBSCRIPTIONS / CCE9BD62-xxxx-xxxx-xxxx-xxxx51CE27DA / RESOURCEGROUPS / RG-CA-C-VNET-PROD / PROVIDERS / MICROSOFT.NETWORK / AZUREFIREWALLS / AFW-CA-C-PROD

The logs are then split into subfolders by:

  • Year
  • Month
  • Day
  • Hour
  • Minute (only 1 folder labeled as 00)
image

Drilling all the way down to the minute folder will reveal a PT1H.json file of the Append blob type. This is the file that contains the firewall traffic log entries:

image
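If you want to pull one of these files down for processing without browsing the portal, a quick sketch with Az.Storage might look like the following; the storage account name and the date segments in the blob path are placeholders, and the signed-in account needs data-plane read access to the container:

# Download a single hour's PT1H.json from the insights-logs-azurefirewall container
# (storage account name and date segments below are placeholders)
$context = New-AzStorageContext -StorageAccountName "<storage-account-name>" -UseConnectedAccount

$blobPath = "resourceId=/SUBSCRIPTIONS/<subscription-id>/RESOURCEGROUPS/RG-CA-C-VNET-PROD/PROVIDERS/MICROSOFT.NETWORK/AZUREFIREWALLS/AFW-CA-C-PROD/y=2023/m=07/d=06/h=14/m=00/PT1H.json"

Get-AzStorageBlobContent -Container "insights-logs-azurefirewall" -Blob $blobPath `
    -Destination ".\PT1H.json" -Context $context -Force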

While browsing through the content of the PT1H.json file, I immediately noticed that the format of the entries did not appear to conform to any of the JSON specifications (RFC 4627, 7159, 8259) because, while I’m not very familiar with the JSON format, I could see that:

  1. The whole JSON file is missing an open square bracket at the beginning and a close square bracket at the end <- Line 1
  2. The nested properties values do not have an open square bracket before the open brace and a close square bracket after the close brace <- Line 5 and Line 22
  3. The close brace for each entry does not have a comma separating it from the next log <- Line 24
image

Pasting this into a JSON validator (https://jsonformatter.curiousconcept.com/) shows that it does not conform to any RFC format:

image

Reviewing the Microsoft documentation confirms that the format of blobs in a storage account is JSON Lines, where each record is delimited by a new line, with no outer records array and no commas between JSON records: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-data-export?tabs=portal#storage-account

Further reading shows that this format has been in place since November 1st, 2018:

Prepare for format change to Azure Monitor platform logs archived to a storage account
https://learn.microsoft.com/en-us/previous-versions/azure/azure-monitor/essentials/resource-logs-blob-format

More reading about the JSON Lines format can be found here: https://jsonlines.org/

My objective was to simply use a PowerShell script to convert a JSON file into a CSV so it could be sent to the client for review, but my script would not work with the JSON Lines format. Fixing this by hand wouldn’t be difficult if there were 2 records, but these firewall logs have thousands of entries and I needed a way to automate the conversion. The whole process of getting this to work took quite a bit of my time, so I wanted to write this blog post to help anyone who may come across the same challenge.

Step #1 – Fixing the poorly formatted JSON file

The first step was to fix the JSON Lines formatted file so it conforms to the RFC 8259 format. This basically meant addressing these 3 items:

  1. The whole JSON file is missing an open square bracket at the beginning and a close square bracket at the end <- Line 1
  2. The nested properties values do not have an open square bracket before the open brace and a close square bracket after the close brace <- Line 5 and Line 22
  3. The close brace for each entry does not have a comma separating it from the next log <- Line 24

I’ve reduced the JSON file to only 2 log entries to show all the changes required:

  1. Add an open [ bracket after properties":
  2. Add a close ] bracket at the end of properties }
  3. Add a comma after close } brace for each log entry but exclude last entry
  4. Add a bracket at the beginning of the JSON
  5. Add a bracket at the end of the JSON
image

The best way to approach this is to use Regex expressions to match the desired block or blocks of lines and add the desired brackets and/or comma. My days of using Regex go back to when I worked on voice deployments for OCS 2007, Lync Server, Skype for Business Server, and Teams Direct Routing. My role over the past few years hasn’t involved much Regex, so if you (the reader) see a better way of writing these expressions, please feel free to provide suggestions in the comments.

Add an open [ bracket after properties": and add a close ] bracket at the end of properties }

The Regex expression to match all the contents in the nested properties block is:

(?<="properties": )([\s\S]*?})

This can be validated on a Regex validator such as: https://regexr.com/

image

We can use the following PowerShell regex replace function to add the missing open and close square brackets:

$fixedJson = [regex]::Replace($badJson, $regexPattern, { param($match) "[{0}]" -f $match.Value })
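For context, $badJson here is simply the raw contents of the log file read as a single string and $regexPattern is the lookbehind expression above; something along these lines:

# Read the raw PT1H.json as one string and set the lookbehind pattern shown above
$badJson = Get-Content -Path ".\PT1H.json" -Raw
$regexPattern = '(?<="properties": )([\s\S]*?})'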

Add a comma after close } brace for each log entry but exclude last entry

With the missing open and close square brackets added, we can use the output and the following regex expression to match all of the log entries and add a comma for separation while NOT including the last log at the end of the entries:

(?="category": )([\s\S]*?}]\s}\W)

image

Note that the last block for the log entry is excluded:

image

--------------------------------- Update August 21-2023 ---------------------------------------

I realized that the previous RegEx expression I used would fail to match scenarios where there are spaces or line breaks between the square and curly brackets, so I’ve updated the expression for the script on GitHub and added the changes here.

(?="category": )([\s\S]*?}[\s\S]*?][\s\S]*?}\W)

The following is a breakdown of each section of the RegEx expression:

(?="category": ) <-- Match the first block before spanning the text after this

([\s\S]*?}[\s\S]*?][\s\S]*?}\W) <-- This is to match everything from the category and to the end

([\s\S]*?} <-- Match everything that is a whitespace and not whitespace, words, digits and end at the curly bracket }

[\s\S]*?] <-- Match everything that is a whitespace and not whitespace, words, digits and end at the square bracket ]

[\s\S]*?} <-- Continue matching everything that is a whitespace and not whitespace, words, digits and end at the next curly bracket }

\W) <-- This excludes the last block

----------------------------------------------------------------------------------------------------------------

We can use the following PowerShell regex replace function to add the missing comma between entries:

$fixedJson = [regex]::Replace($badJson, $regexPattern, { param($match) "{0}," -f $match.Value })

Add a bracket at the beginning of the JSON and add a bracket at the end of the JSON

With the comma added between each log, we can now proceed to add the open and close square bracket to the beginning and end of the file with the following regex expression:

^([^$]+)

image

We can use the following PowerShell regex replace function to add the missing open and close square brackets to the beginning and end:

$fixedJson = [regex]::Replace($badJson, $regexPattern, { param($match) "[{0}]" -f $match.Value })

With the missing formatting added, we should now be able to validate the JSON file:

image
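Putting the three fixes together, the clean-up portion of the conversion looks roughly like the sketch below; the file name is a placeholder and the authoritative version is the GitHub script referenced in the next step:

# Read the raw JSON Lines formatted log file as a single string (file name is a placeholder)
$badJson = Get-Content -Path ".\PT1H.json" -Raw

# 1. Wrap the nested "properties" block in square brackets
$regexPattern = '(?<="properties": )([\s\S]*?})'
$fixedJson = [regex]::Replace($badJson, $regexPattern, { param($match) "[{0}]" -f $match.Value })

# 2. Add a comma after every log entry except the last one
$regexPattern = '(?="category": )([\s\S]*?}[\s\S]*?][\s\S]*?}\W)'
$fixedJson = [regex]::Replace($fixedJson, $regexPattern, { param($match) "{0}," -f $match.Value })

# 3. Wrap the whole file in an outer array
$regexPattern = '^([^$]+)'
$fixedJson = [regex]::Replace($fixedJson, $regexPattern, { param($match) "[{0}]" -f $match.Value })

# Confirm the repaired content now parses as valid JSON
$parsed = $fixedJson | ConvertFrom-Json
Write-Host "Parsed $($parsed.Count) log entries"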

Step #2 – Create a PowerShell script that will read the Azure Firewall Storage Account JSON and convert to CSV

With the Regex expressions defined and the missing brackets, braces, and commas accounted for, the next step is to write a PowerShell script that will read the native JSON file, format the JSON so it is RFC 8259 compliant, parse each entry, and place the log entry details into the rows and columns of a CSV file.

The script can be found in my GitHub repo: https://github.com/terenceluk/Azure/blob/main/Azure%20Firewall/Convert-JSON-Logs-to-CSV.ps1

The components of the script are as follows:

1. The first portion where we use Regex to fix the JSON formatting

image

2. Begin parsing the formatted JSON file:

Update the following 2 variables:

  1. $pathToJsonFile = "PT1H2.json"
  2. $pathToOutputFile = "PT1H2.csv"
image

When writing the portion of the code used for parsing the JSON file, I noticed that there wasn’t an easy way for me to automatically read through the column headings to avoid defining them directly, because there are different types of records in the JSON file with varying headings. This meant that in order to transfer all the records into a CSV, I would need to define all of the headings upfront. Since not all the headings will be used for every record, any entry that does not have a value for a heading will have that cell left blank.
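To illustrate that idea, a trimmed-down version of the parsing and export logic might look like the sketch below; the column list here is a hypothetical subset, and the real script on GitHub defines the full set of headings:

# Parse the repaired JSON and flatten each record into a row for the CSV
$records = $fixedJson | ConvertFrom-Json

# Hypothetical subset of column headings - the real script defines the full set upfront
$columns = "time", "category", "operationName", "Msg"

$rows = foreach ($record in $records) {
    $row = [ordered]@{}
    foreach ($column in $columns) {
        # Top-level fields (time, category, ...) live on the record itself;
        # anything else is looked up in the nested properties block and left blank if absent
        if ($null -ne $record.$column) {
            $row[$column] = $record.$column
        }
        else {
            $row[$column] = $record.properties.$column
        }
    }
    [pscustomobject]$row
}

$rows | Export-Csv -Path $pathToOutputFile -NoTypeInformation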

The end result of the export will look something like the following CSV:

imageimageimage

The diagnostic settings I selected for this example included the Legacy Azure Diagnostics category, so the logs will have some redundant records where the legacy entries have the details in the Msg column, while the newer categories have the record details split into their own columns.

I hope this blog post helps anyone who may be looking for a way to parse and create a CSV file from the Azure Firewall JSON log files. I’ll be writing a follow-up post in the future to demonstrate using a script to read the folders in the storage account so this doesn’t have to be done manually for every JSON file at every hour of the day.

Tuesday, July 4, 2023

Creating a Logic App that retrieves AAD sign-in events from Log Analytics and sends a report in an email with a CSV attachment and HTML table insert

Two of the common questions I’ve been asked since publishing the following post over a year ago:

Monitoring, Alerting, Reporting Azure AD logins and login failures with Log Analytics and Logic Apps

http://terenceluk.blogspot.com/2022/02/monitoring-alerting-reporting-azure-ad.html

… are whether there was a way to:

  1. Provide the report as a CSV attachment
  2. Pretty up the table that is inserted into the email

Providing the report as a CSV attachment is fairly easy, but making the Html table more aesthetically pleasing wasn’t. After trying a few methods without much success, I ended up landing on using an Azure Function App that takes the report in JSON format, creates the Html-formatted table with colour output, then returns it to the Logic App. The method isn’t very efficient, but it provides the desired result, so this post serves to demonstrate the configuration.

The screenshot below shows two reports and emails sent out in the Logic App flow. The first set of actions, Run query and visualize results and Send an email (V2), highlighted in red, is what my previous post demonstrated: it sends out an email containing a plainly formatted HTML table. The second set, Run query and list results, Create blob (V2), Convert JSON to HTML, Delete blob (V2), Initialize Variable, Set Variable, Create CSV table, and Send an email (V2) 2, highlighted in green, contains the additional steps to create a CSV file with the report and send an email with a coloured HTML table:

image

Step #1 - Create Storage Account

While it is possible to send the full JSON directly to a Function App’s HTTP trigger, logs exceeding the maximum size would fail, so I opted to first create a JSON file and temporarily place it in a Storage Account container so it can be retrieved by a Function App for processing. The storage of the JSON could be permanent as well, but most environments I work with typically send AAD logs to a storage account for audit retention, so this design only stores the file for processing and deletes it afterwards.

Begin by creating a Storage Account and a container that will store the JSON file. For the purpose of this example, the container will be named: integration

image

A file similar to the one shown in this screenshot will be temporarily placed in this container for the Function App to process:

image

Due to the sensitivity of the data, we want to ensure that the container is not publicly accessible, so the Public access level should be configured as Private (no anonymous access):

image

For improved security, I always prefer to disable Allow storage account key access (Shared Key authorization) and use Azure Active Directory (Azure AD) for authorization. The method by which the Function App will securely access the Storage Account container is through a managed identity maintained by AAD, so unless there is a need to allow shared key authorization, we can go ahead and disable it:

image
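If you prefer to script this setup instead of clicking through the portal, a rough Az PowerShell equivalent might look like the following; the resource group, account name, location, and SKU are placeholders:

# Create the storage account with shared key authorization disabled
# (resource group, account name, location and SKU are placeholders)
New-AzStorageAccount -ResourceGroupName "rg-reporting" -Name "rgcacinfratemp" `
    -Location "canadacentral" -SkuName Standard_LRS -Kind StorageV2 `
    -AllowSharedKeyAccess $false

# Create the private "integration" container using Azure AD authorization
# (the signed-in account needs a Storage Blob Data role to create the container over AAD auth)
$context = New-AzStorageContext -StorageAccountName "rgcacinfratemp" -UseConnectedAccount
New-AzStorageContainer -Name "integration" -Context $context -Permission Off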

You’ll notice that browsing into the container through the portal will now require the Authentication method to be configured as Azure AD User Account:

image

Step #2 - Create Azure Function App

With the storage account created, we can proceed to create the Azure Function App that will be triggered via HTTP with the URL of the JSON file passed to it.

image

Create a new function of the type HTTP Trigger:

image

Open the function, navigate to Code + Test and paste the code from my GitHub repo into the function: https://github.com/terenceluk/Azure/blob/main/Function%20App/JSON-To-HTML-Function.ps1

image

Notable items in the code are the following:

  1. The container name is extracted from the full path to the JSON file with Regex
  2. The blob and storage account names are extracted from the full path to the JSON file with the substring and indexOf methods
  3. The function app expects the full URL path to be passed as a JSON like the following:

{
    "body": "https://rgcacinfratemp.blob.core.windows.net/integration/AD-Report-06-29-2023.json"
}

Another way of defining the storage account and container name for the function app is in the Application settings, but this hardcodes the values and requires updating whenever they change:

image
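The authoritative function code is in the GitHub repo above; as a heavily simplified sketch of the flow inside run.ps1 (without the colour styling), it looks something like this — note the request body key and the way the record array is converted are assumptions to adjust for your own payload:

using namespace System.Net

param($Request, $TriggerMetadata)

# Assumption: the caller posts the full blob URL in the JSON body under a "path" key
$blobUrl = $Request.Body.path
$uri = [System.Uri]$blobUrl

# Extract the storage account, container and blob names from the URL
$storageAccountName = $uri.Host.Split('.')[0]
$containerName = $uri.Segments[1].Trim('/')
$blobName = $uri.Segments[-1]

# Authenticate as the Function App's managed identity and build an AAD-backed context
Connect-AzAccount -Identity | Out-Null
$context = New-AzStorageContext -StorageAccountName $storageAccountName -UseConnectedAccount

# Download the JSON report and convert it into a basic HTML table
$localPath = Join-Path $env:TEMP $blobName
Get-AzStorageBlobContent -Container $containerName -Blob $blobName -Destination $localPath `
    -Context $context -Force | Out-Null
$records = Get-Content -Path $localPath -Raw | ConvertFrom-Json
$html = $records | ConvertTo-Html -Fragment   # expand/flatten $records first if your JSON nests the rows

# Return the HTML fragment to the Logic App
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = ($html -join "`n")
})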

The function app uses two Az modules to authenticate as a managed identity and retrieve the JSON file. Rather than loading the full Az module, which I have never had any luck with because the time it takes to download causes my function apps to time out, we will only load Az.Accounts and Az.Storage. Proceed to navigate to the App files blade, open the requirements.psd1 file, and edit it as follows:

# This file enables modules to be automatically managed by the Functions service.
# See https://aka.ms/functionsmanageddependency for additional information.
#
@{
    # For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
    # To use the Az module in your function app, please uncomment the line below.
    # 'Az' = '10.*'
    'Az.Accounts' = '2.*'
    'Az.Storage' = '4.*'
}

image

I’ve run into scenarios where the modules do not get downloaded or loaded properly, and the way I typically troubleshoot the issue is to navigate into Kudu for the function app to check which modules were or were not downloaded via the URL:

https://json-to-html-converter.scm.azurewebsites.net/

imageimage

Once the function app code has been saved and configuration updated, proceed to navigate to the Identity blade and turn on system managed identity:

image

Step #3 - Create Logic App

One of the key differences between the plain table report and the new report is that the old one uses Run query and visualize results to query Log Analytics for the report details, while the new report uses Run query and list results to query Log Analytics for the data. The Run query and visualize results action provides the following output options:

  • Html Table
  • Pie Chart
  • Time Chart
  • Bar Chart
image

In order to generate an output that will allow us to create a customized Html table and CSV file, we need to use the Run query and list results action, which generates a JSON file. This JSON file allows us to create a blob in a storage account container that will be used to generate a customized Html table, as well as create a CSV file:

image

We want to create the JSON file with a meaningful name so we’ll be using the concat function to name the file:

concat('AD-Report-',formatDateTime(utcNow(), 'MM-dd-yyyy'),'.json')

This expression will generate a file with the name AD-Report-<today’s date>.json

The blob content will be provided by the results from the Run query and list results action.

image

Once the JSON file with the AAD logs is created and placed into the storage account, the Logic App will call the Azure Function App and pass it the full URL path so the Function App can retrieve the JSON file, format the data into an Html table, and return it to the Logic App. Upon receiving the Html-formatted results, the Logic App will then delete the log file. The remaining 2 steps after obtaining the properly formatted Html code are to create and set a variable so it can be used to send the logs as a table.

image

With the Html email report ready, we will then use the Create CSV table action to create a CSV file from the Run query and list results action and send the email:

image

The following is a screenshot of how the email is composed, with the EmailBody variable containing the HTML content, the CSV table attached as an attachment, and the same name format applied to it:

image

Once the Logic App has been saved, proceed to navigate to the Identity blade and turn on system managed identity:

image

Step 4 – Assign the Function App and Logic App managed identities permissions to the Storage Account

The last step is to grant the managed identities the appropriate permissions to the storage account.

The Azure Function App will only need Storage Blob Data Reader because it will only need to retrieve the JSON file.

The Logic App will need Storage Blob Data Contributor because it will need to write the JSON file to the storage account and then delete it afterwards.

image
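These role assignments can be made on the storage account's Access control (IAM) blade, or scripted with something like the sketch below; the object IDs and scope are placeholders:

# Scope of the storage account holding the integration container (IDs and names are placeholders)
$scope = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"

# The Function App's managed identity only needs to read the JSON file
New-AzRoleAssignment -ObjectId "<function-app-identity-object-id>" `
    -RoleDefinitionName "Storage Blob Data Reader" -Scope $scope

# The Logic App's managed identity needs to create and later delete the JSON file
New-AzRoleAssignment -ObjectId "<logic-app-identity-object-id>" `
    -RoleDefinitionName "Storage Blob Data Contributor" -Scope $scope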

Step 5 – Test Report

Proceed to run the Logic App and the following report should arrive in the configured mailbox:

image

Note that the CSS nth-child selector for even and odd rows does not work with Outlook, so while the generated Html would display alternating blue rows as shown in the screenshot below, the report sent to Outlook will not look the same.

image

Troubleshooting

The following PowerShell script can be used to call the Function App directly if the Logic App does not generate the report and you want to troubleshoot by bypassing the Logic App.

GitHub: https://github.com/terenceluk/Azure/blob/main/Function%20App/Test-Calling-API.ps1

$Body = @{
    path = "https://storageAccountName.blob.core.windows.net/integration/AD-Report-06-28-2023.json"
}

$Parameters = @{
    Method = "POST"
    Uri = "https://yourFunctionName.azurewebsites.net/api/Converter?code=xxxxxxxxxxxxm_Dnc_avHxxxxxxxxxxxxxxDH1A=="
    Body = $Body | ConvertTo-Json
    ContentType = "application/json"
}

Invoke-RestMethod @Parameters | Out-File "C:\Temp\Call-API.html"

The Function App URI can be located in the field shown in the screenshot below:

image