Deploying Azure Kubernetes Service Demo Store App with Azure Open AI – Part 1

I'd like to share a blog series on my experience deploying, running, and learning the technical details of the AKS Demo Store App and how it works with Azure OpenAI. This experience is based on the MS Learn article https://learn.microsoft.com/en-us/azure/aks/open-ai-quickstart?tabs=aoai

This AKS Demo Store App can be found at https://github.com/Azure-Samples/aks-store-demo/tree/main

The app uses the Azure OpenAI service to generate product descriptions based on a product's name and tags/keywords. On the Demo Store's product edit page, this AI functionality generates a description from the product name and keywords.


The Demo Store application architecture is composed of multiple services, each deployed in its own container and running as a Kubernetes deployment.

My existing AKS cluster is on version 1.23.15 with a node pool of 2 nodes of VM size Standard_B4ms, which is small compute but good enough for demo purposes. I already have other applications running in this cluster.

I log in to my AKS cluster.

Then I list my nodes to confirm I am logged in and can view resources.
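For reference, the login and node listing can be done with commands like these (the resource group and cluster names are placeholders for your own values):

```shell
# Fetch kubeconfig credentials for the cluster (names below are placeholders)
az aks get-credentials --resource-group myResourceGroup --name myAKSCluster

# Confirm the login worked and view the nodes, their versions, and VM details
kubectl get nodes -o wide
```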

From the Azure Portal:

Create a new k8s namespace called pets and deploy the demo store app into it with this k8s manifest YAML file – https://raw.githubusercontent.com/Azure-Samples/aks-store-demo/main/aks-store-all-in-one.yaml. No need to configure any specific parameters in the file.

kubectl create ns pets
kubectl apply -f https://raw.githubusercontent.com/Azure-Samples/aks-store-demo/main/aks-store-all-in-one.yaml -n pets

After the deployment completes, I check the k8s deployments:

Check the k8s pods in the pets namespace. All pods are running.

Check the k8s services. The store-admin app is exposed to the internet with a public external IP. The store-admin app is where you can generate product descriptions with AI, as shown earlier in this post.
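The checks above amount to a few kubectl queries against the pets namespace, along these lines:

```shell
# Deployments and pods created by the all-in-one manifest
kubectl get deployments -n pets
kubectl get pods -n pets

# The store-admin service is of type LoadBalancer, so it shows a public EXTERNAL-IP
kubectl get services -n pets
```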

Next, deploy the AI service. Open ai-service.yaml and enter the values for the container's defined environment variables. The AI Service generates the product description by sending a text completion prompt to the deployed Azure OpenAI resource.

These values can be found in the Azure OpenAI service that I have already created, under the Keys and Endpoint blade.

For the Azure OpenAI deployment name, go to Model deployments. I have already created a deployment called chatgpt1 using the model gpt-35-turbo. You can create a deployment with other models such as gpt-4 or text-davinci, preferably one of the newer models. A deployment in Azure OpenAI is a set of configurations that specify the base model, model version, content filter, tokens-per-minute rate limit, and whether dynamic quota is enabled. It is not enough to refer to a base model alone. Here are the properties of my chosen chatgpt1 deployment:
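If you prefer the CLI over the portal, a model deployment can also be created with the Azure CLI. A sketch, assuming placeholder resource names and a model version available in your region (parameter names can vary between CLI versions):

```shell
# Create a gpt-35-turbo deployment named chatgpt1 on an existing Azure OpenAI
# resource (resource name, group, and model version below are placeholders)
az cognitiveservices account deployment create \
  --resource-group myResourceGroup \
  --name myOpenAIResource \
  --deployment-name chatgpt1 \
  --model-name gpt-35-turbo \
  --model-version "0301" \
  --model-format OpenAI \
  --sku-name Standard \
  --sku-capacity 1
```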

You can read more about the base models at https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#model-summary-table-and-region-availability

As a result, here is my ai-service yaml with configuration:
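For reference, the env section of my ai-service container configuration looks roughly like this. The variable names reflect my setup of the sample and may differ in the current repo; the endpoint and key are placeholders:

```yaml
# Sketch of the ai-service container env configuration (values are placeholders)
env:
  - name: USE_AZURE_OPENAI              # tells the service to use Azure OpenAI rather than OpenAI
    value: "True"
  - name: AZURE_OPENAI_DEPLOYMENT_NAME
    value: "chatgpt1"                   # the model deployment created earlier
  - name: AZURE_OPENAI_ENDPOINT
    value: "https://<your-resource>.openai.azure.com/"
  - name: OPENAI_API_KEY
    value: "<your-key>"                 # in real use, prefer a Kubernetes Secret
```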

Let's apply ai-service.yaml and confirm the ai-service pod is running:
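The apply and verification steps look like this (the label selector is an assumption; adjust it to match the manifest):

```shell
# Deploy the AI service into the pets namespace
kubectl apply -f ai-service.yaml -n pets

# Check that the ai-service pod reaches Running status
kubectl get pods -n pets -l app=ai-service
```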

Keep in mind that in this implementation the ai-service connects over the Microsoft backbone, not through any virtual network. In production scenarios, it is better to connect to your Azure OpenAI service through a private endpoint.
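As a sketch of the private endpoint approach (all names are placeholders, and the VNet must be reachable from the AKS cluster; this is not part of the demo deployment itself):

```shell
# Create a private endpoint for the Azure OpenAI resource in an existing VNet/subnet
# ('account' is the sub-resource group id used for Cognitive Services resources)
az network private-endpoint create \
  --resource-group myResourceGroup \
  --name myOpenAIPrivateEndpoint \
  --vnet-name myVnet \
  --subnet mySubnet \
  --private-connection-resource-id "$(az cognitiveservices account show \
      --resource-group myResourceGroup --name myOpenAIResource --query id -o tsv)" \
  --group-id account \
  --connection-name myOpenAIConnection
```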

From the Azure portal, we can see the deployments in the pets namespace as follows:
Deployments

Pods

Services. The ai-service is called from store-admin, and the Python application code in ai-service makes HTTP calls to the Azure OpenAI resource, which is external to the AKS cluster.

You can see the public IP of store-admin as 4.172.1.155 in the above screenshot; open it in the browser. You see a list of products; click on one of them to edit it.

I add keywords as follows so that the prompt incorporates them into the generated description.

I click Ask AI Assistant, and the app calls out to the Azure OpenAI service to generate a new description.

I think this description does indeed incorporate the keywords.

By going to Monitoring > Insights in AKS, you can see the logs of the ai-service container when I click the Ask AI Assistant button again. You can see it calling out to the Azure OpenAI service and displaying the generated text.
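The same logs can also be tailed directly with kubectl instead of the portal:

```shell
# Stream logs from the ai-service deployment while clicking Ask AI Assistant
kubectl logs -n pets deploy/ai-service -f
```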

Turning to the Azure OpenAI resource's metrics, I can see when the OpenAI requests were made.

I have shown my experience deploying and testing the AKS Demo Store app in AKS, leveraging the Azure OpenAI service. Although this application is simple, AKS is scalable, extensible, and resilient enough for intensive use of the Azure OpenAI service.

In the next blog post Deploying Azure Kubernetes Service Demo Store App with Azure Open AI – Part 2, I will describe the application logic and implementation within the ai-service container with Python and Semantic Kernel.

