
Monday, January 29, 2024

Updating aztfexport generated "res-#" resource names with PowerShell scripts

Happy new year! It has been an extremely busy start to 2024 for me with the projects I’ve been involved in, so I’ve fallen behind on a few of the blog posts I’ve had queued up since November of last year. While I still haven’t gotten to the backlog yet, I would like to quickly write this one up, as it was a challenge I came across while testing the aztfexport (Azure Export for Terraform) tool to export a set of Azure Firewall, VPN Gateway, and VNet resources in an environment. The following is the Microsoft documentation for this tool:

Quickstart: Export your first resources using Azure Export for Terraform
https://learn.microsoft.com/en-us/azure/developer/terraform/azure-export-for-terraform/export-first-resources?tabs=azure-cli

Those who have worked with this tool will know that the files it generates name the resources identified for import as:

  • res-0
  • res-1
  • res-2

… and so on. These references are used across multiple files:

  • aztfexportResourceMapping.json
  • import.tf
  • main.tf


While the generated files with these default names will work, they make it very difficult to identify what the resources actually are. One option is to manually update these files with search and replace, but anything over 20 resources quickly becomes tedious and error prone.

With this challenge, I decided to create 2 PowerShell scripts to automate the process of searching for and replacing the names res-0, res-1, res-2, and so on. The first script will parse the import.tf file:


… and extract the fields “id” and “to” into 2 columns, then add another 2 columns: one containing the “res-#” value and the other containing the name of the resource in Azure, writing the result to a CSV:


If the desire is to use the Azure names as the resource names, then no changes are required. If alternate names are desired, update the values in the Azure Resource Logical Name column of the CSV.
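For illustration, a minimal sketch of the first script’s extraction logic (assuming Terraform-style import blocks with “id” and “to” lines in import.tf; the column names here are illustrative, and the full script linked below handles more cases):

# Parse the "id" and "to" values out of import.tf (a simplified sketch)
$ids = Select-String -Path ".\import.tf" -Pattern 'id\s*=\s*"([^"]+)"' |
    ForEach-Object { $_.Matches[0].Groups[1].Value }
$tos = Select-String -Path ".\import.tf" -Pattern '\bto\s*=\s*(\S+)' |
    ForEach-Object { $_.Matches[0].Groups[1].Value }

$rows = for ($i = 0; $i -lt $ids.Count; $i++) {
    [pscustomobject]@{
        Id                       = $ids[$i]
        To                       = $tos[$i]
        ResourceName             = ($tos[$i] -split '\.')[-1]  # res-0, res-1, ...
        AzureResourceLogicalName = ($ids[$i] -split '/')[-1]   # name portion of the Azure resource ID
    }
}
$rows | Export-Csv -Path ".\ResourceMapping.csv" -NoTypeInformation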

The second script will then reference this CSV to search through the directory containing the Terraform files and update the res-# references to the desired values.
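The heart of that second script could look like the following sketch (assuming the CSV columns from above and that the exported files sit in the current directory; the real script is linked below):

# Apply the CSV mapping across all exported files (a simplified sketch)
$map = Import-Csv -Path ".\ResourceMapping.csv"
Get-ChildItem -Path "." -Include *.tf, *.json -Recurse | ForEach-Object {
    $text = Get-Content -Path $_.FullName -Raw
    foreach ($row in $map) {
        # Word boundaries keep res-1 from also matching inside res-10
        $text = $text -replace "\b$([regex]::Escape($row.ResourceName))\b", $row.AzureResourceLogicalName
    }
    Set-Content -Path $_.FullName -Value $text
}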

The two scripts can be found here in my GitHub repo:

Create the CSV file from import.tf - Extract-import-tf-file.ps1
https://github.com/terenceluk/Azure/blob/main/PowerShell/Extract-import-tf-file.ps1

Replace all references to res-# with desired values - Replace-Text-with-CSV-Reference.ps1
https://github.com/terenceluk/Azure/blob/main/PowerShell/Replace-Text-with-CSV-Reference.ps1

I hope this helps anyone who may be looking for this automated way to update exported Terraform code.

Sunday, October 22, 2023

Deploy a ChatGPT service with Azure OpenAI Service in 6 minutes with PowerShell

OpenAI’s ChatGPT has been one of the most talked about services amongst my professional contacts as well as personal friends since its launch on November 30th, 2022. What this Chat Generative Pre-trained Transformer can perform is truly remarkable and opens up so many possibilities for the future. Many of my colleagues have asked me whether I’ve tested it and why I haven’t written any blog posts since Azure released the OpenAI service preview in March 2023. The short answer is that I have performed some testing with it over the last few months but haven’t been able to commit the amount of time I want due to my busy work schedule. I finally had a bit of a breather over the past few weeks, so I’ve managed to really try out the following:

  1. Pairing with Cognitive Search in a RAG (Retrieval-Augmented Generation) architecture to augment the ChatGPT LLM (Large Language Model) with data in an Azure Storage Account
  2. Deploying front-end UI solutions for the OpenAI service
  3. Diving deep into how to secure Azure OpenAI, Cognitive Searches, and data sources with private endpoints and shared private access

It’s amazing how much material there is for #1 and #2 but not as much as I’d like for #3. There is so much Azure’s AI Services can do and I look forward to the projects to come in the following years.

The purpose of this blog post is to show just how fast and easy it is to deploy an Azure OpenAI service with a front-end UI for a private ChatGPT service where internal employees of organizations can safely enter questions with sensitive data. Microsoft is very clear on the usage of the inputs entered in the prompt (https://learn.microsoft.com/en-us/legal/cognitive-services/openai/data-privacy):

Your prompts (inputs) and completions (outputs), your embeddings, and your training data:

  • are NOT available to other customers.
  • are NOT available to OpenAI.
  • are NOT used to improve OpenAI models.
  • are NOT used to improve any Microsoft or 3rd party products or services.
  • are NOT used for automatically improving Azure OpenAI models for your use in your resource (The models are stateless, unless you explicitly fine-tune models with your training data).
  • Your fine-tuned Azure OpenAI models are available exclusively for your use.

The Azure OpenAI Service is fully controlled by Microsoft; Microsoft hosts the OpenAI models in Microsoft’s Azure environment and the Service does NOT interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API).

This will put many organizations at ease as I’ve been to one too many dinner parties where I’ve heard people talk about entering data into OpenAI’s ChatGPT to write a letter to HR. I don’t even want to ask what they were entering in there and what else it has been used for.

In any case, I took some time to put together a PowerShell script that prompts for a few questions: what to name the resource group containing all the resources to be created, the name of the Azure OpenAI instance, the LLM model to use, and what Azure subscription to use, and it takes care of the rest (Container App, Log Analytics Workspace, etc.). I timed the duration of the script and it took 5 minutes and 32 seconds to run. Yes, I understand this is an imperative run rather than declarative. I’m a huge supporter of Infrastructure as Code, but I needed something that would let me run in any Azure environment to quickly build a demo with all components in a single Resource Group so I can easily tear it down by simply deleting the RG.

The deployment is very basic with no private endpoints as I will reserve that for a future post. Here is the simple topology:

image

With that, let’s get into it now.

Prerequisites

As of October 22, 2023, you may see the Azure OpenAI service as an option in the Azure AI Services blade but attempting to create the service will display the following message:

image

Azure OpenAI Service is currently available to customers via an application form. The selected subscription has not been enabled for use of the service and does not have quota for any pricing tiers. Click here to request access to Azure OpenAI service.

image

Clicking on the link will bring you to a Microsoft Form with questions about who you are, why you want to use the service, and what features you would want to turn on:

image

**I’ve blocked out the content in the screenshot of the form as I am unsure if posting the verbiage is in violation of Microsoft’s policy.

You’ll need to fill out the form, submit it, and receive an approval, which is indicated to take up to 10 business days. My form submission took only a day, but I assume this can vary, so if you intend on using the service, fill out the form ahead of time so you don’t have to wait when you actually want to deploy.

Using a PowerShell script to deploy all the services in 6 minutes (or less)

The PowerShell script I put together can be retrieved from my GitHub repository here: https://github.com/terenceluk/Azure/blob/main/AI%20Services/Deploy-Azure-OpenAI-with-Chatbot-UI.ps1

The script is meant to be executed from the console, and it will ask the user to:

  1. Select a subscription found in the tenant
  2. Provide a name for a new Resource Group
  3. Provide a name for the OpenAI instance
  4. Select a model from the options
image
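The prompting portion follows a simple pattern along these lines (a sketch, not the exact code from my script; it assumes the Az PowerShell module is installed, and the model list is a hypothetical illustration):

# A sketch of the interactive prompts
Connect-AzAccount | Out-Null

# 1. Select a subscription found in the tenant
$subs = Get-AzSubscription
for ($i = 0; $i -lt $subs.Count; $i++) { Write-Host "$($i): $($subs[$i].Name)" }
$choice = [int](Read-Host "Select a subscription")
Set-AzContext -Subscription $subs[$choice].Id | Out-Null

# 2./3. Names for the new Resource Group and OpenAI instance
$rgName     = Read-Host "Provide a name for the new Resource Group"
$openAiName = Read-Host "Provide a name for the OpenAI instance"

# 4. Select a model (hypothetical list for illustration)
$models = @('gpt-35-turbo', 'gpt-4')
for ($i = 0; $i -lt $models.Count; $i++) { Write-Host "$($i): $($models[$i])" }
$model = $models[[int](Read-Host "Select a model")]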

The rest of the components such as Container App and Log Analytics will be automatically named (derived from the instance name) and deployed through the remaining script. At the end of a successful run, the browser will automatically launch and the following screen will be displayed:

imageimage

Azure Resources Deployed

All of the resources for the solution are meant to be deployed into a single resource group for ease of cleanup if it is used for a demo:

image

The following are screenshots of the resources:

image

image

image

Note that the script will not place the value of the Azure OpenAI key into the environment as a plain-text variable; rather, it will store it as a secret that the environment variable references:

image

image
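For context, this is the Azure CLI pattern of storing a value as a Container App secret and pointing an environment variable at it with the secretref: syntax. A rough illustration (the names and image below are placeholders, not the exact values my script uses):

az containerapp create `
    --name "chatbot-ui" `
    --resource-group "rg-openai-demo" `
    --environment "cae-openai-demo" `
    --image "<chatbot-ui-image>" `
    --secrets "openai-key=$openAiKey" `
    --env-vars "AZURE_OPENAI_API_KEY=secretref:openai-key"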

I did not create a custom health probe so the one created is the default:

image

Securing the ChatGPT UI portal with authentication

One of the components I’m still working on is using BICEP to configure the Container App with Microsoft as an identity provider, so that the portal prompts the user for credentials and requires a log in with a valid account in the tenant’s Entra ID / Azure AD before getting into the portal. If you’d like to turn this on after the script deploys the services, simply navigate to the Container App’s Authentication blade and click on Add identity provider:

image

Select Microsoft as the Identity provider:

image

You can leave the settings as default and proceed to create the identity provider:

image

This will create an App Registration in the tenant’s Entra ID / Azure AD for the Container App to authenticate the user:

image

Note that you would need to consent to the Container App’s App Registration in the Azure portal (portal.azure.com) or perform the consent upon first logging in:

image

Credits

I want to give a huge thanks to Mckay Wrigley (https://github.com/mckaywrigley) for developing and sharing his chatbot-ui Docker container (https://github.com/mckaywrigley/chatbot-ui) for the world to use. If you search the internet for deployment demonstrations, you are bound to see 9 out of 10 demos using his Chatbot UI. I spent quite a bit of time using Postman to interact with the Azure OpenAI service APIs, and as I am not a developer, it would take me quite a bit of time to develop something half as great as Mckay’s.

Final Remarks

One of the behaviors I noticed during the creation and deletion of the services is that when an Azure OpenAI instance is deleted, it is dropped into a recycling-bin-like location, and if you decide to deploy another instance with the same name, the deployment will fail. If you have deleted an instance and want to reuse the name, use Manage deleted resources in the Azure OpenAI blade to locate and purge the instance. From what I can tell, the purge is instant and you can proceed to redeploy a new instance with the same name.

image
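If you prefer the command line over the portal, the Azure CLI can list and purge soft-deleted instances along these lines (the names below are placeholders):

# List soft-deleted Cognitive Services / Azure OpenAI accounts
az cognitiveservices account list-deleted

# Purge a soft-deleted instance so its name can be reused right away
az cognitiveservices account purge --location "eastus" --resource-group "my-rg" --name "my-openai-instance"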

I hope this helps anyone out there who has been looking to test this great service offering but hasn’t had the time to get started. There are many other great posts I’d like to write about Cognitive Search and the “under the hood” view of the traffic flow, but I will save those for another day. Happy chatting!

Friday, October 6, 2023

Using PowerShell to create multiple Azure Storage Account Containers with Metadata using a list on an Excel spreadsheet

I recently worked on a project where we had to create hundreds of containers across multiple Azure Storage Accounts because we needed to use the storage account SFTP service, and in order to jail users into their own directories, each local SFTP user account needed to have its home folder set to its own container. This may change in the future, but working with this requirement meant many containers had to be created. In addition to creating containers, I also wanted each to have metadata added for the organization that the container belonged to, so to reduce the repetitive manual labour, I decided to write a script.

The script I created can be found at my following GitHub repo: https://github.com/terenceluk/Azure/blob/main/PowerShell/Create-Storage-Account-Container.ps1

The format of the spreadsheet should look as such:

image

To handle scenarios where new storage account containers are added at a later time after the script has been executed once, the code will check and skip the creation of the container if it already exists.
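The core of such a loop looks something like the following simplified sketch (the column names are illustrative and assume the list has been exported to CSV; my actual script is linked above):

$rows = Import-Csv -Path ".\Containers.csv"
foreach ($row in $rows) {
    $ctx = (Get-AzStorageAccount -ResourceGroupName $row.ResourceGroup -Name $row.StorageAccount).Context
    # Skip creation if the container already exists (supports re-running the script later)
    if (Get-AzStorageContainer -Name $row.Container -Context $ctx -ErrorAction SilentlyContinue) {
        Write-Host "Container $($row.Container) already exists. Skipping."
        continue
    }
    $container = New-AzStorageContainer -Name $row.Container -Context $ctx
    Write-Host "Container $($row.Container) has been created."
    # Metadata is sent as request headers, so the values must be ASCII-only (see below)
    $metadata = [System.Collections.Generic.Dictionary[string,string]]::new()
    $metadata.Add("Organization", $row.Organization)
    $container.BlobContainerClient.SetMetadata($metadata, $null)
}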

A scenario where I’ve noticed this script will fail is when there are non-ASCII characters in the metadata value. These include characters from languages such as French (é É) or the Microsoft Word dash/hyphen character. I don’t think there is a way to have the PowerShell cmdlet accept these characters, as the metadata is sent as request headers, which must contain only ASCII characters.

Below is an example of the output when these special non-ASCII characters are encountered and the metadata add fails. Note that the container does still get created:

Container 17689 has been created.
MethodInvocationException:
Line |
  40 |  $container.BlobContainerClient.SetMetadata($metadata, $null) …
     |  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
     | Exception calling "SetMetadata" with "2" argument(s): "Retry failed after 6 tries. Retry settings can be adjusted in ClientOptions.Retry or by configuring a custom retry policy in ClientOptions.RetryPolicy. (Request headers must contain only ASCII characters.) (Request headers must contain only ASCII characters.) (Request headers must contain only ASCII characters.) (Request headers must contain only ASCII characters.) (Request headers must contain only ASCII characters.) (Request headers must contain only ASCII characters.)"

image

Hope this helps anyone who might be looking for a script like this.

Thursday, September 28, 2023

Using an Azure Function App to automatically Start and Stop (Deallocate) Virtual Machines based on tags

One of the most common questions I’ve been asked in the past when it comes to cost management in Azure is what the options are for powering virtual machines off and on based on a schedule. A quick Google search for this returns a mix of Azure Automation Accounts, the Auto-Shutdown feature blade within the VM (which only powers off but does not power on), Logic Apps, the new Automation Tasks, and Azure Functions. Each of these options has its advantages and disadvantages, an associated cost to execute, and, in a few cases, limitations on the capabilities available for this type of automation. As much as I like Logic Apps because of its visual, little-to-no-code capabilities, I find it a bit cumbersome to configure each step via a GUI, and the flow of a Logic App can quickly become difficult to follow when there are multiple branches of conditions. My preference for most automation is Function Apps because they allow me to write code to perform anything I need. With the above described, it’s probably not a surprise that this post is going to demonstrate this type of automation with an Azure Function App.

The scenario I want to provide is an ask from a client who wanted the following:

  1. Auto Start virtual machines at a certain time
  2. Auto Deallocate virtual machines at a certain time
  3. Capability to set start and deallocate schedules for weekdays and weekends
  4. Capability to indicate virtual machines should either be powered on or deallocated over the weekend (they had some workloads that did not need to be on during the week but had to be on over the weekend)
  5. Lastly, and the most important, they wanted to use Tags to define the schedule because they use Azure Policies to enforce tagging

There are plenty of scripts available on the internet that provide most of this functionality, but I could not find one that allowed this type of control over the weekend, so I spent some time writing one.

Before I begin, I would like to explain that the script I wrote uses the Azure Resource Graph to query the status of the virtual machines, their resource groups, and their tags, because I find that having ARM interact with the resource providers can take a very long time compared to querying the Resource Graph, which is much faster. Those who have used the Resource Graph Explorer in the Azure portal will recognize the KQL query I used to retrieve the information. What’s great about this approach is that we can test the query directly in the portal:

imageimage
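As a sketch of what such a query looks like when run from PowerShell (the exact KQL in my script differs; Search-AzGraph comes from the Az.ResourceGraph module):

$query = @"
Resources
| where type =~ 'microsoft.compute/virtualmachines'
| extend powerState = tostring(properties.extended.instanceView.powerState.code)
| project name, resourceGroup, subscriptionId, powerState, tags
"@
Search-AzGraph -Query $query -First 1000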

The design of the tagging for the virtual machines to control the power on, deallocate, and scheduling is as follows:

Tag | Value | Example | Purpose | Behavior
--- | --- | --- | --- | ---
WD-AutoStart | Time in 24-hour format | 08:00 | Defines the time from which the VM should be powered on during the weekday | This condition is met if the time is equal to or past the value, Monday to Friday
WD-AutoDeallocate | Time in 24-hour format | 17:00 | Defines the time from which the VM should be powered off during the weekday | This condition is met if the time is equal to or past the value, Monday to Friday
WE-AutoStart | Time in 24-hour format | 09:00 | Defines the time from which the VM should be powered on during the weekend | This condition is met if the time is equal to or past the value, Saturday and Sunday
WE-AutoDeallocate | Time in 24-hour format | 15:00 | Defines the time from which the VM should be powered off during the weekend | This condition is met if the time is equal to or past the value, Saturday and Sunday
Weekend | On or Off | On | Defines whether the VM should be on or off over the weekend | This condition should be set if a weekday schedule is configured and the VM needs to be on over the weekend, as it is the condition that turns the VM back on after the Friday power off

The following is an example of a virtual machine with tags applied:

image
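To make the evaluation concrete, here is a simplified illustration of how a WD-AutoStart tag might be checked (the actual script’s logic is more involved; $vm represents one row returned by the Resource Graph query, and the timezone is a hypothetical example):

$timeZone = "Eastern Standard Time"
$now = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId((Get-Date).ToUniversalTime(), $timeZone)
$isWeekday = $now.DayOfWeek -notin @('Saturday', 'Sunday')
$startTime = [datetime]::ParseExact($vm.tags.'WD-AutoStart', 'HH:mm', $null)
if ($isWeekday -and $now.TimeOfDay -ge $startTime.TimeOfDay) {
    # Condition met: the current time is equal to or past the WD-AutoStart value
    Start-AzVM -ResourceGroupName $vm.resourceGroup -Name $vm.name -NoWait
}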

With the explanation out of the way, let’s get started with the configuration.

Step #1 – Create Function App

Begin by creating a Function App with the Runtime stack PowerShell Core version 7.2. The hosting option can either be Consumption, Premium, or App Service Plan, but for this example, we’ll use Consumption:

image

Proceed to configure the rest of the properties of the Function App:

imageimage

I always recommend turning on Application Insights whenever possible as it helps with debugging but it is not necessary:

image

You can integrate the Function App with a GitHub account for CI/CD, but for this example we won’t be enabling it:

image

Proceed to create the Function App:

image

Step #2 – Turn on System Assigned Managed Identity and Assign Permissions

To avoid managing certificates and secrets, and to enhance the security posture of your Azure environment, it is recommended to use managed identities wherever possible. Proceed to turn on the System assigned managed identity in the Identity blade of the Function App so an Enterprise Application object is created in Azure AD / Entra ID, which we can then use to assign permissions to the resources in the subscription:

image

You’ll see an Object (principal) ID created for the Function App after successfully turning on the System assigned identity:

image

Browsing to the Enterprise Applications in Entra ID will display the identity of the Function App:

imageimage

With the system-assigned managed identity created for the Function App, we can now proceed to grant it permissions to the resources it needs access to. This example will assign the managed identity as a Virtual Machine Contributor on the subscription so it can perform start and deallocate operations on all the virtual machines. Navigate to the subscription’s Access control (IAM) blade and click on Role assignments:

image

Proceed to select the Virtual Machine Contributor role:

image

Locate the Function App for the managed identity and save the permissions:

image
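If you’d rather script this role assignment, it looks something along these lines (a sketch with placeholder names; Get-AzFunctionApp comes from the Az.Functions module):

$app = Get-AzFunctionApp -ResourceGroupName "rg-automation" -Name "func-vm-scheduler"
New-AzRoleAssignment -ObjectId $app.IdentityPrincipalId `
    -RoleDefinitionName "Virtual Machine Contributor" `
    -Scope "/subscriptions/<subscription-id>"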

Step #3 – Configure the Function App

It’s currently September 27, 2023 as I write this post, and I’ve noticed that the Function App page layout and blades have changed. The Functions blade under the Functions option no longer appears to exist, so create the function by going to Overview and, under the Functions tab, clicking Create in Azure Portal:

image

The type of function we’ll be creating will be the Timer Trigger and the Schedule will be configured as the following CRON expression:

0 0 * * * 0-6

The above CRON expression uses the six-field NCRONTAB format that Azure Functions expects (second, minute, hour, day, month, day-of-week), so the function runs at the top of every hour, every day, every month, Sunday to Saturday.

image

Once the Function is created, proceed to click on Code + Test:

image

The code for the Function can be copied from my GitHub repo at the following URL: https://github.com/terenceluk/Azure/blob/main/Function%20App/Start-Stop-VM-Function-Based-On-Tags.ps1

Make sure you update the subscriptions list and the timezone you want this script to use for the Tags:

image
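As a hypothetical illustration (the actual variable names in the script may differ), the values to update look like this:

$subscriptions = @("<subscription-id-1>", "<subscription-id-2>")  # subscriptions whose VMs should be evaluated
$timeZone = "Eastern Standard Time"  # timezone used to interpret the tag values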

Save the code and navigate back out to the Function App, select App Files, then select requirements.psd1 in the heading to load the file. Note that the default template of this file has everything commented out. We could simply remove the hash character in front of 'Az' = '10.*' to load all Az modules, but I’ve had terrible luck doing so, as the process of downloading the modules would cause the Function App to time out. What I like to do is indicate exactly what modules I need and specify them.

image

The following are the modules my PowerShell script uses, so proceed to copy and paste these module requirements into the requirements.psd1 file:

'Az.Accounts' = '2.*'
'Az.Compute' = '2.*'
'Az.ResourceGraph' = '0.*'

image
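For reference, the resulting requirements.psd1 (which is just a PowerShell hashtable) would look something like this:

@{
    'Az.Accounts'      = '2.*'
    'Az.Compute'       = '2.*'
    'Az.ResourceGraph' = '0.*'
}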

Save the file and then switch to the host.json file:

image

As specified in the following Microsoft documentation: https://learn.microsoft.com/en-us/azure/azure-functions/functions-host-json#functiontimeout, we can increase the default timeout of a consumption-based Function App by adding the following attribute and value to the file:

{
  "functionTimeout": "00:10:00"
}

Proceed to add the value to handle large environments that may cause the Function App to exceed the default 5-minute limit:

image

Save the file and navigate back to the Function, Code + Test blade, and proceed to test the Function:

image

The execution should return a 202 Accepted HTTP response code and the virtual machines should now be powered on and off at the scheduled time:

image

I hope this blog post helps anyone who might be looking for a script that can handle weekend scheduling of VM start and stop operations.