Bulk Test APIs Before Production — Azure DevOps Release Gate With Azure Durable Functions

Validate APIs in batch and automatically release the application to the production environment only after validation passes


A release pipeline in Azure DevOps allows developers to deploy applications across multiple stages. Like a build pipeline, it lets DevOps engineers define automated tasks and jobs such as the build process, unit tests, integration tests, and so on.

In addition to these tasks, we can also define pre-deployment and post-deployment condition checks that must pass before the package progresses to subsequent stages. The diagram below illustrates the flow.

Source: https://docs.microsoft.com/en-us/azure/devops/pipelines/release/approvals/?view=azure-devops

Under these conditions, we can either define human intervention, which requires approval from selected members, or define gates, which run automated tasks to check conditions before the deployment is released.

This article will focus on release gates. A release gate adds an extra layer of inspection, whether it’s a compliance check or a validation of application functionality or readiness, which helps reduce issues after deployment.

Scenarios

As part of the automated testing stage, developers will often include automated unit tests and integration tests within the pipeline. Functional testing is usually a mixture of automation and manual testing by testers. A fairly common functional test is performing REST API calls to ensure the services are working as intended.

Besides calling the system’s own APIs, this can also involve calling external APIs, for example to check the status of external endpoints, run compliance checks, etc.

There are several options available under the release gate, as shown in the diagram below.

Checking Azure Policy compliance is commonly used, as it allows customers to deploy workloads in an environment that complies with organizational standards. We can also invoke APIs via a generic connection, or Azure Functions. Lastly, querying Azure Monitor alerts and work items is critical too, to ensure the release to the new environment follows certain standards. You can get more details regarding release gates here: https://docs.microsoft.com/en-us/azure/devops/pipelines/release/approvals/gates?view=azure-devops

Defining a generic connection for a REST API is simple. We just create a service connection in the project settings and define the server URL and authentication details.

This works for a handful of APIs that we can define one by one. But what about 20, 30, or more APIs?

You may be wondering why we don’t just do this within the pipeline. Yes, we can include this activity within the pipeline. The major distinction here is the ability to determine whether the deployment can proceed to the next stage or environment by defining conditions on the results, coupled with manual intervention for human approval.

Back to service connections. As shown in the diagram, we effectively need to define the endpoints one by one, which is not feasible if we are calling more than 10 endpoints. The question then becomes: is there a way to orchestrate all the API calls and aggregate the responses, to serve as a release gate?

Solutions

Azure Functions is well known as serverless compute for “Functions-as-a-Service”, often used in microservices workloads. Functions are usually stateless and short-lived. Azure Durable Functions, on the other hand, are meant to handle stateful jobs, in addition to the benefits of Azure Functions. Most notably, Azure Durable Functions excel at serving as an orchestrator to perform tasks in parallel, or in defined sequences. There are several application patterns for Azure Durable Functions for different use cases, documented here: https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp

In this context, we can use the fan-out/fan-in pattern. The idea is simple: we invoke an orchestrator, which then invokes the list of REST APIs in parallel, and finally aggregates the results and returns them as output.

Let’s see how this works in detail.

Azure Durable Functions Orchestrator

I will start with the template. Using VS Code or Visual Studio, I can select an Azure Functions template, and in this case I selected Durable Functions Orchestration as the starting point. The following template is generated.
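Here is a minimal sketch of what the generated C# template looks like (the function names follow the name you choose for the project, and the exact scaffolding may differ slightly between SDK versions):

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class DurableFunctionsOrchestration
{
    [FunctionName("DurableFunctionsOrchestration")]
    public static async Task<List<string>> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var outputs = new List<string>();

        // Function chaining: each activity is awaited before the next one starts.
        outputs.Add(await context.CallActivityAsync<string>("DurableFunctionsOrchestration_Hello", "Tokyo"));
        outputs.Add(await context.CallActivityAsync<string>("DurableFunctionsOrchestration_Hello", "Seattle"));
        outputs.Add(await context.CallActivityAsync<string>("DurableFunctionsOrchestration_Hello", "London"));

        return outputs;
    }

    [FunctionName("DurableFunctionsOrchestration_Hello")]
    public static string SayHello([ActivityTrigger] string name, ILogger log)
    {
        log.LogInformation($"Saying hello to {name}.");
        return $"Hello {name}!";
    }

    [FunctionName("DurableFunctionsOrchestration_HttpStart")]
    public static async Task<HttpResponseMessage> HttpStart(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient starter,
        ILogger log)
    {
        // Start the orchestration and return the built-in status-check response.
        string instanceId = await starter.StartNewAsync("DurableFunctionsOrchestration", null);
        log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}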

It generates 3 functions, namely “DurableFunctionsOrchestration”, “DurableFunctionsOrchestration_Hello”, and “DurableFunctionsOrchestration_HttpStart”. Let’s understand them from the bottom up.

“DurableFunctionsOrchestration_HttpStart” is defined as the entry point for the workflow, serving as a durable client. As the HttpTrigger binding on the request parameter shows, it accepts HTTP GET or HTTP POST requests and starts the orchestration flow. Of course, depending on the use case, we can define other triggers too, such as a timer trigger, an Azure Service Bus trigger, etc. Here’s the list of bindings available, and you can use any supported binding as the trigger.

In this example, I will stick to the HTTP trigger. It’s easy to invoke, and it returns the orchestration details as an HTTP response, such as the orchestration status, output, etc., which is what we need in this case.

Let’s move to “DurableFunctionsOrchestration”. As the name suggests, this function orchestrates the activities. I can define how the activities are orchestrated, such as running them asynchronously in parallel, or running them serially depending on the output of the previous activity. In the template above, the orchestrator simply calls the activity a few times in sequence, which is function chaining. This is where we design one of the six patterns mentioned above.

The activity targeted from “DurableFunctionsOrchestration” is called “DurableFunctionsOrchestration_Hello”, and it takes a string as input. This is where all the heavy lifting is performed.

Back to our scenario. The idea is to create a function that issues multiple API calls to validate their status and output, and returns the results to the Azure Pipelines release gate.

Now that we have the idea of Azure Durable Functions, let’s see how we can implement it in this scenario. I will start with the activity, the function that performs all the heavy-lifting tasks.

I named it HealthCheck_Executor. For demo purposes, I simply echo against my backend API and validate that the returned message is the same as my input. The only additional thing I do is construct an object called APICheckObject to capture each API’s name and its status. Remember, I want to aggregate all the results later on, so recording a 1 or 0 is a simple way to do that, and at the same time it lets me identify which API is not functioning as intended.
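Here is a simplified sketch of that activity; the echo endpoint URL is a placeholder for my backend API, and the full implementation is in the GitHub repository linked below:

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;

public class APICheckObject
{
    public string ApiName { get; set; }
    public int ApiStatus { get; set; }  // 1 = responded as expected, 0 = failed
}

public static class HealthCheckActivities
{
    // Reuse a single HttpClient instance across activity invocations.
    private static readonly HttpClient Client = new HttpClient();

    [FunctionName("HealthCheck_Executor")]
    public static async Task<APICheckObject> Run([ActivityTrigger] string apiName, ILogger log)
    {
        log.LogInformation($"Checking {apiName}...");

        // Placeholder echo endpoint: the backend simply returns the message it receives.
        var response = await Client.GetAsync($"https://my-backend.example.com/api/echo?message={apiName}");
        var body = await response.Content.ReadAsStringAsync();

        return new APICheckObject
        {
            ApiName = apiName,
            // 1 if the API responded successfully and echoed the input back, 0 otherwise.
            ApiStatus = response.IsSuccessStatusCode && body.Contains(apiName) ? 1 : 0
        };
    }
}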

Next, let’s move on to the orchestrator. The first question to address is: which pattern should I use? Function chaining and fan-out/fan-in both seem to be viable candidates.

I would go for function chaining if the APIs had to be tested in sequence, for example if the second API depended on the output of the first. In my case, all the APIs are independent of each other, hence I opt for fan-out/fan-in to parallelize all execution for a shorter execution time.

I will then return the results as a list of objects, which becomes the output of the entire process.
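Here is a sketch of the fan-out/fan-in orchestrator; the API names are illustrative, and in practice they could come from configuration or the orchestration input:

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class HealthCheckOrchestrator
{
    [FunctionName("HealthCheck_Orchestrator")]
    public static async Task<List<APICheckObject>> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // Illustrative list of APIs to validate.
        var apisToCheck = new[] { "test1", "test2", "test3" };

        // Fan out: schedule all activity calls without awaiting them one by one.
        var tasks = apisToCheck
            .Select(api => context.CallActivityAsync<APICheckObject>("HealthCheck_Executor", api))
            .ToList();

        // Fan in: wait for every call to complete, then aggregate the results.
        var results = await Task.WhenAll(tasks);
        return results.ToList();
    }
}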

The last piece of the puzzle is a way to invoke this durable function. I will stick with the HTTP trigger, but I need one more thing. If I call the durable function directly, it returns several URLs, mainly for checking the status of or terminating the orchestration. The Azure Pipelines release gate will not be able to consume that output.

To address this, I can simply create another Azure Function with an HTTP trigger to invoke the durable function and, at the same time, aggregate the output for validation in Azure DevOps.
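Here is a sketch of that wrapper, assuming the orchestrator and output object above; the polling interval and function names are placeholders, and the overall wait should stay well under the gate’s timeout:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HealthCheckHttpStart
{
    [FunctionName("HealthCheck_HttpStart")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        [DurableClient] IDurableOrchestrationClient starter,
        ILogger log)
    {
        string instanceId = await starter.StartNewAsync("HealthCheck_Orchestrator", null);
        log.LogInformation($"Started health check orchestration with ID = '{instanceId}'.");

        // Poll until the orchestration finishes.
        DurableOrchestrationStatus status = await starter.GetStatusAsync(instanceId);
        while (status.RuntimeStatus == OrchestrationRuntimeStatus.Pending ||
               status.RuntimeStatus == OrchestrationRuntimeStatus.Running)
        {
            await Task.Delay(TimeSpan.FromSeconds(2));
            status = await starter.GetStatusAsync(instanceId);
        }

        // Aggregate the activity results into the shape the release gate will evaluate.
        var details = status.Output.ToObject<List<APICheckObject>>();
        return new OkObjectResult(new
        {
            totalSuccess = details.Count(d => d.ApiStatus == 1),
            details
        });
    }
}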

The full source code can be found on GitHub.

Azure Pipeline Release Gate Validation

Let’s inspect the output from the Azure Function above. It contains the total number of successful calls, as well as each API’s name and status.

{
  "totalSuccess": 3,
  "details": [
    { "apiName": "test1", "apiStatus": 1 },
    { "apiName": "test2", "apiStatus": 1 },
    { "apiName": "test3", "apiStatus": 1 }
  ]
}

In Azure Pipelines, let’s move to the release pipeline and modify our release gate. We can add the Azure Function as either a pre-deployment or a post-deployment gate; in my case, I will go for post-deployment. Adding it is straightforward: click “+ Add” in the top left corner under “Gates”, then put in the Azure Function URL and function key. If you don’t know where to retrieve the key, it’s in the Azure portal.

I showed the API response just now, and now I can define the success criteria for this post-deployment check. I will check against the total number of successful calls: the check passes if the number of successful calls meets my threshold. You can get more details on other ways of defining the condition in the documentation.
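For example, assuming the response shape shown earlier, the success criteria could be an expression along these lines (the threshold of 3 matches the three APIs in this demo):

eq(root['totalSuccess'], 3)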

Here’s what my release gate looks like.

To validate it, simply create a release and check the logs. You will notice that the condition is checked twice, at 5-minute intervals. You can configure these parameters on the deployment gate page shown above.

The diagram below visualizes the entire flow.

With that, you can now validate APIs in batch and automatically release the application to the production environment only after validation passes!
