
Showing posts with label Function App. Show all posts

Thursday, September 28, 2023

Using an Azure Function App to automatically Start and Stop (Deallocate) Virtual Machines based on tags

One of the most common questions I’ve been asked when it comes to cost management in Azure is what the options are for powering virtual machines off and on based on a schedule. A quick search on the internet returns a mix of Azure Automation Accounts, the Auto-Shutdown feature blade within the VM (which only powers off, not on), Logic Apps, the newer Automation Tasks, and Azure Functions. Each of these options has its advantages, disadvantages, and an associated cost to execute, and a few of them have limitations on the capabilities available for this type of automation. As much as I like Logic Apps because of their visual, little-to-no-code capabilities, I find it a bit cumbersome to configure each step via a GUI, and the flow of a Logic App can quickly become difficult to follow when there are multiple branches of conditions. My preference for most automation is Function Apps because they allow me to write code to perform anything I need. With the above described, it’s probably not a surprise that this post is going to demonstrate this type of automation with an Azure Function App.

The scenario I want to provide is an ask from a client who wanted the following:

  1. Auto Start virtual machines at a certain time
  2. Auto Deallocate virtual machines at a certain time
  3. Capability to set start and deallocate schedules for weekdays and weekends
  4. Capability to indicate virtual machines should either be powered on or deallocated over the weekend (they had some workloads that did not need to be on during the week but had to be on over the weekend)
  5. Lastly, and the most important, they wanted to use Tags to define the schedule because they use Azure Policies to enforce tagging

There are plenty of scripts available on the internet that provide most of this functionality, but I could not find one that allowed this type of control over the weekend, so I spent some time writing one.

Before I begin, I would like to explain that the script I wrote uses the Azure Resource Graph to query the status of the virtual machines, their resource groups, and their tags, because having ARM interact with the resource providers can take a very long time compared to querying the Resource Graph, which is much faster. Those who have used the Resource Graph Explorer in the Azure portal will recognize the KQL query I used to retrieve the information. What’s great about this approach is that we can test the query directly in the portal:
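A Resource Graph query along these lines returns the power state and tags for every VM; this is my own sketch of such a query, not necessarily the exact one used in the script:

```kusto
Resources
| where type =~ 'microsoft.compute/virtualmachines'
| extend powerState = tostring(properties.extended.instanceView.powerState.code)
| project name, resourceGroup, subscriptionId, powerState, tags
```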

imageimage

The design of the tagging for the virtual machines to control the power on, deallocate, and scheduling is as follows (the weekday tags are prefixed WD- and the weekend tags WE-):

| Tag | Value | Example | Purpose | Behavior |
| --- | --- | --- | --- | --- |
| WD-AutoStart | Time in 24 hour format | 08:00 | Defines the start of the time when the VM should be powered on during the weekday | This condition is met if the time is equal to or past the value for Monday to Friday |
| WD-AutoDeallocate | Time in 24 hour format | 17:00 | Defines the start of the time when the VM should be powered off during the weekday | This condition is met if the time is equal to or past the value for Monday to Friday |
| WE-AutoStart | Time in 24 hour format | 09:00 | Defines the start of the time when the VM should be powered on during the weekend | This condition is met if the time is equal to or past the value for Saturday and Sunday |
| WE-AutoDeallocate | Time in 24 hour format | 15:00 | Defines the start of the time when the VM should be powered off during the weekend | This condition is met if the time is equal to or past the value for Saturday and Sunday |
| Weekend | On or Off | On | Defines whether the VM should be on or off over the weekend | This should be set if a weekday schedule is configured and the VM needs to be on over the weekend, as it is the condition that turns the VM back on after the Friday power off |
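The evaluation this tagging design drives can be sketched in Python (purely illustrative; the actual script is PowerShell, and I'm assuming the weekend tags are prefixed WE- while the weekday ones use WD-):

```python
from datetime import datetime
from typing import Optional

def desired_state(tags: dict, now: datetime) -> Optional[str]:
    """Return 'start', 'deallocate', or None for a VM based on its schedule tags."""
    is_weekend = now.weekday() >= 5  # Saturday = 5, Sunday = 6
    prefix = "WE-" if is_weekend else "WD-"
    current = now.strftime("%H:%M")  # 24-hour HH:MM compares correctly as a string

    start = tags.get(prefix + "AutoStart")
    stop = tags.get(prefix + "AutoDeallocate")

    # No weekend schedule: fall back to the Weekend On/Off tag
    if is_weekend and not (start or stop):
        return {"On": "start", "Off": "deallocate"}.get(tags.get("Weekend"))

    # A condition is met once the current time is equal to or past the tag value
    if stop and current >= stop:
        return "deallocate"
    if start and current >= start:
        return "start"
    return None
```

Checking the deallocate condition first means that once the power-off time passes, it wins over the earlier power-on time for that day.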

The following is an example of a virtual machine with tags applied:

image

With the explanation out of the way, let’s get started with the configuration.

Step #1 – Create Function App

Begin by creating a Function App with the Runtime stack PowerShell Core version 7.2. The hosting option can be Consumption, Premium, or an App Service Plan, but for this example, we’ll use Consumption:

image

Proceed to configure the rest of the properties of the Function App:

imageimage

I always recommend turning on Application Insights whenever possible as it helps with debugging but it is not necessary:

image

You can integrate the Function App with a GitHub account for CI/CD, but for this example we won’t be enabling it:

image

Proceed to create the Function App:

image

Step #2 – Turn on System Assigned Managed Identity and Assign Permissions

To avoid managing certificates and secrets, and to enhance the security posture of your Azure environment, it is recommended to use managed identities wherever possible. Proceed to turn on the System assigned managed identity in the Identity blade of the Function App so that an Enterprise Application object is created in Azure AD / Entra ID, which we can then use to assign permissions to the resources in the subscription:

image

You’ll see an Object (principal) ID created for the Function App after successfully turning on the System assigned identity:

image

Browsing to the Enterprise Applications in Entra ID will display the identity of the Function App:

imageimage

With the system managed identity created for the Function App, we can now proceed to grant it permissions to the resources it needs access to. This example will assign the managed identity as a Virtual Machine Contributor to the subscription so it can perform start and deallocate operations on all the virtual machines. Navigate to the subscription’s Access control (IAM) blade and click on Role assignments:

image

Proceed to select the Virtual Machine Contributor role:

image

Locate the Function App for the managed identity and save the permissions:

image

Step #3 – Configure the Function App

It’s currently September 27, 2023 as I write this post, and I noticed that the Function App page layout and blades have changed. The Functions blade under the Functions option no longer appears to exist, so create the function by selecting Overview and, under the Functions tab, clicking on Create in Azure Portal:

image

The type of function we’ll be creating will be the Timer Trigger and the Schedule will be configured as the following CRON expression:

0 0 * * * 0-6

The above CRON expression runs the function at the top of every hour, every day, every month, Sunday to Saturday.
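For reference, Azure Functions timer triggers use NCRONTAB expressions with six fields, the first being seconds:

```
{second} {minute} {hour} {day} {month} {day-of-week}
    0        0      *      *      *        0-6
```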

image

Once the Function is created, proceed to click on Code + Test:

image

The code for the Function can be copied from my GitHub repo at the following URL: https://github.com/terenceluk/Azure/blob/main/Function%20App/Start-Stop-VM-Function-Based-On-Tags.ps1

Make sure you update the subscriptions list and the timezone you want this script to use for the Tags:

image

Save the code and navigate back out to the Function App, select App files, then select requirements.psd1 in the heading to load the file. Note that the default template of this file has everything commented out. We can simply remove the hash character in front of 'Az' = '10.*' to load all Az modules, but I’ve had terrible luck in doing so, as the process of downloading the files would cause the Function App to time out. What I like to do is specify exactly the modules I need.

image

The following are the modules my PowerShell script uses, so proceed to copy and paste the module requirements into the requirements.psd1 file:

'Az.Accounts' = '2.*'
'Az.Compute' = '2.*'
'Az.ResourceGraph' = '0.*'

image

Save the file and then switch to the host.json file:

image

As specified in the following Microsoft documentation: https://learn.microsoft.com/en-us/azure/azure-functions/functions-host-json#functiontimeout, we can increase the default timeout of a consumption-based Function App by adding the following attribute and value to the file:

{
  "functionTimeout": "00:10:00"
}

Proceed to add the value to handle large environments that may cause the Function App to exceed the default 5-minute limit:

image

Save the file and navigate back to the Function, Code + Test blade, and proceed to test the Function:

image

The execution should return a 202 Accepted HTTP response code and the virtual machines should now be powered on and off at the scheduled time:

image

I hope this blog post helps anyone who might be looking for a script that can handle weekend scheduling of VM start and stop operations.

Tuesday, July 4, 2023

Creating a Logic App that retrieves AAD sign-in events from Log Analytics and sends a report in an email with a CSV attachment and HTML table insert

Two of the common questions I’ve been asked since publishing the following post over a year ago:

Monitoring, Alerting, Reporting Azure AD logins and login failures with Log Analytics and Logic Apps

http://terenceluk.blogspot.com/2022/02/monitoring-alerting-reporting-azure-ad.html

… are whether there is a way to:

  1. Provide the report as a CSV attachment
  2. Pretty up the table that is inserted into the email

Providing the report as a CSV attachment is fairly easy, but making the HTML table more aesthetically pleasing wasn’t. After trying a few methods without much success, I ended up using an Azure Function App that takes the report in JSON format, creates the HTML formatted table with colour output, then returns it to the Logic App. The method isn’t very efficient but provides the desired result, so this post serves to demonstrate the configuration.
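The conversion itself is done by a PowerShell function (linked later in this post); as an illustration of the idea, a minimal Python sketch that renders JSON records as an HTML table with inline row colours (inline styles, since Outlook ignores most CSS) could look like this:

```python
from html import escape

def json_to_html_table(records, even_color="#d6e4f0", odd_color="#ffffff"):
    """Render a list of flat JSON records as an HTML table.

    Row colours are applied as inline styles because Outlook ignores
    most CSS, including the nth-child selector.
    """
    if not records:
        return "<p>No records</p>"
    headers = list(records[0].keys())
    rows = ["<tr>" + "".join(f"<th>{escape(h)}</th>" for h in headers) + "</tr>"]
    for i, record in enumerate(records):
        colour = even_color if i % 2 == 0 else odd_color
        cells = "".join(f"<td>{escape(str(record.get(h, '')))}</td>" for h in headers)
        rows.append(f'<tr style="background-color:{colour}">{cells}</tr>')
    return "<table>" + "".join(rows) + "</table>"
```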

The screenshot below shows two reports and emails sent out in the Logic App flow. The first Run query and visualize results and Send an email (V2), highlighted in red, is what my previous post demonstrated, and it sends out an email that contains a plainly formatted HTML table. The second Run query and list results, Create blob (V2), Convert JSON to HTML, Delete blob (V2), Initialize Variable, Set Variable, Create CSV table, Send an email (V2) 2, highlighted in green, are the additional steps to create a CSV file with the report and send an email with a coloured HTML table:

image

Step #1 - Create Storage Account

While it is possible to send the full JSON directly to a Function App’s HTTP Trigger, logs exceeding the maximum size would fail, so I opted to first create a JSON file and temporarily place it in a Storage Account container so it can be retrieved by the Function App for processing. The storage of the JSON could be permanent as well, but most environments I work with typically send AAD logs to a storage account for audit retention, so this design only stores the file for processing and deletes it afterwards.

Begin by creating a Storage Account and a container that will store the JSON file. For the purpose of this example, the container will be named: integration

image

The Function App that will be created will temporarily place a file similar to the one shown in this screenshot:

image

Due to the sensitivity of the data, we want to ensure that the container is not publicly accessible, so the Public access level should be configured as Private (no anonymous access):

image

For improved security, I always prefer to disable Allow storage account key access (Shared Key authorization) and use Azure Active Directory (Azure AD) for authorization. The method in which the Function App will securely access the Storage Account container is through a managed identity maintained by AAD so unless there is a need to allow shared key authorization, we can go ahead and disable it:

image

You’ll notice that browsing into the container through the portal will now require the Authentication method to be configured as Azure AD User Account:

image

Step #2 - Create Azure Function App

With the storage account created, we can proceed to create the Azure Function App that will be triggered via HTTP with the URL of the JSON file passed to it.

image

Create a new function of the type HTTP Trigger:

image

Open the function, navigate to Code + Test and paste the code from my GitHub repo into the function: https://github.com/terenceluk/Azure/blob/main/Function%20App/JSON-To-HTML-Function.ps1

image

Notable items in the code are the following:

  1. The container name is extracted from the full path to the JSON file with Regex
  2. The blob and storage account names are extracted from the full path to the JSON file with the substring and indexOf methods
  3. The function app expects the full URL path to be passed as JSON like the following:

{
  "body": "https://rgcacinfratemp.blob.core.windows.net/integration/AD-Report-06-29-2023.json"
}
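For illustration, the same extraction can be sketched in Python (the actual function does this in PowerShell with Regex, substring, and indexOf):

```python
from urllib.parse import urlparse

def parse_blob_url(url: str):
    """Split a blob URL into (storage account, container, blob name).

    e.g. https://account.blob.core.windows.net/container/file.json
    """
    parsed = urlparse(url)
    account = parsed.netloc.split(".")[0]  # first label of the hostname
    container, _, blob = parsed.path.lstrip("/").partition("/")
    return account, container, blob
```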

Another way of defining the storage account and container name for the function app is in the Application settings, but this hardcodes the values and requires updating:

image

The function app uses two Az modules to authenticate as a managed identity and retrieve the JSON file. Rather than loading the full Az module, which I have never had any luck with because the amount of time it takes to download causes my function apps to time out, we will only load Az.Accounts and Az.Storage. Proceed to navigate to the App files blade, open requirements.psd1, and edit the file as follows:

# This file enables modules to be automatically managed by the Functions service.
# See https://aka.ms/functionsmanageddependency for additional information.
#
@{
  # For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
  # To use the Az module in your function app, please uncomment the line below.
  # 'Az' = '10.*'
  'Az.Accounts' = '2.*'
  'Az.Storage' = '4.*'
}

image

I’ve run into scenarios where the modules do not get downloaded or loaded properly, and the way I typically troubleshoot the issue is to navigate into Kudu for the function app to check which modules were or were not downloaded via the URL:

https://json-to-html-converter.scm.azurewebsites.net/

imageimage

Once the function app code has been saved and configuration updated, proceed to navigate to the Identity blade and turn on system managed identity:

image

Step #3 - Create Logic App

One of the key differences between the plain table report and the new report is that the old one uses Run query and visualize results to query Log Analytics for the report details, while the new report uses Run query and list results to query Log Analytics for the data. The Run query and visualize results action provides these output options:

  • Html Table
  • Pie Chart
  • Time Chart
  • Bar Chart
image

In order to generate output that will allow us to create a customized HTML table and CSV file, we need to use the Run query and list results action, which generates JSON. This JSON allows us to create a blob in a storage account container that will be used to generate a customized HTML table, as well as to create a CSV file:

image

We want to create the JSON file with a meaningful name so we’ll be using the concat function to name the file:

concat('AD-Report-',formatDateTime(utcNow(), 'MM-dd-yyyy'),'.json')

This expression will generate a file with the name AD-Report-<today’s date>.json
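The equivalent logic, illustrated in Python:

```python
from datetime import datetime, timezone

# Illustrative Python equivalent of the Logic App concat/formatDateTime expression
file_name = "AD-Report-" + datetime.now(timezone.utc).strftime("%m-%d-%Y") + ".json"
```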

The blob content will be provided by the results from the Run query and list results action.

image

Once the JSON file with the AAD logs is created and placed into the storage account, the Logic App will call the Azure Function App and pass the full URL path so the Function App can retrieve the JSON file, format the data into an HTML table, and return it to the Logic App. Upon receiving the HTML formatted results, the Logic App will then delete the log file. The remaining 2 steps after obtaining the properly formatted HTML code are to create and set a variable so it can be used to send the logs as a table.

image

With the HTML email report ready, we will then use the Create CSV table action to create a CSV file from the Run query and list results action and send the email:

image

The following is a screenshot of how the email is composed, with the EmailBody variable containing the HTML content, the CSV table attached as an attachment, and the same name format applied to it:

image

Once the Logic App has been saved, proceed to navigate to the Identity blade and turn on system managed identity:

image

Step 4 – Assign managed identity for the Function App and Logic App permissions to the Storage Account

The last step is to grant the managed identities the appropriate permissions to the storage account.

The Azure Function App will only need Storage Blob Data Reader because it will only need to retrieve the JSON file.

The Logic App will need Storage Blob Data Contributor because it will need to write the JSON file to the storage account and then delete it afterwards.

image

Step 5 – Test Report

Proceed to run the Logic App and the following report should arrive in the configured mailbox:

image

Note that the CSS nth-child selector for even and odd rows does not work with Outlook, so while the generated HTML displays alternating blues for rows as shown in the screenshot below, the report rendered in Outlook will not look the same.

image

Troubleshooting

If the Logic App does not generate the report, the following PowerShell script can be used to troubleshoot by calling the Function App directly.

GitHub: https://github.com/terenceluk/Azure/blob/main/Function%20App/Test-Calling-API.ps1

$Body = @{
  path = "https://storageAccountName.blob.core.windows.net/integration/AD-Report-06-28-2023.json"
}

$Parameters = @{
  Method = "POST"
  Uri = "https://yourFunctionName.azurewebsites.net/api/Converter?code=xxxxxxxxxxxxm_Dnc_avHxxxxxxxxxxxxxxDH1A=="
  Body = $Body | ConvertTo-Json
  ContentType = "application/json"
}

Invoke-RestMethod @Parameters | Out-File "C:\Temp\Call-API.html"

The Function App URI can be located in the field shown in the screenshot below:

image