November 4, 2022

ADX Implementation for a Global Cloud Management Software Provider

A system bogged down by requests, with no existing solution in sight

As a global cloud management software provider, our customer helps IT service providers build and grow successful cloud businesses, with one primary goal: making its partners security compliant. Partners measure their compliance status using three security metrics: Current Average Security Score, Average Security Score, and Max Recorded Security Score.

Day in and day out, our customer received a stream of requests from its partners (in JSON format) to process these security scores. The existing system could respond to some of them, but it was clearly not scalable and could not serve all of these requests simultaneously. To solve the problem, the customer wanted a solution that could handle multiple concurrent requests and refresh data for near real-time reports.
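To make the three metrics concrete, here is a minimal Python sketch of how they might be computed for a single partner. The record shape, field names, and the 30-day window behind the "current" average are illustrative assumptions, not the customer's actual definitions.

    from datetime import datetime, timedelta, timezone

    # Hypothetical score history for one partner; the schema is an
    # illustrative assumption, not the customer's actual payload.
    history = [
        {"ts": datetime(2022, 9, 1, tzinfo=timezone.utc), "score": 64.0},
        {"ts": datetime(2022, 10, 20, tzinfo=timezone.utc), "score": 81.0},
        {"ts": datetime(2022, 11, 1, tzinfo=timezone.utc), "score": 77.0},
    ]

    now = datetime(2022, 11, 4, tzinfo=timezone.utc)
    recent = [r["score"] for r in history if r["ts"] >= now - timedelta(days=30)]

    # Assumed definitions: "current" covers the last 30 days; the other
    # two metrics cover the full recorded history.
    current_avg_score = sum(recent) / len(recent) if recent else None
    avg_score = sum(r["score"] for r in history) / len(history)
    max_recorded_score = max(r["score"] for r in history)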


The ask

· Optimize and scale the existing data pipelines to handle up to 250 simultaneous requests.
· Have near real-time data refresh for batch requests.
· Publish real-time reporting insights for further analysis and decision-making.



Figure 1: Solution Architecture

Our winning solution

After a deep-dive analysis, our team identified the following solutions to overcome the key challenges:

For pulling data from upstream sources and sending data to storage…

We evaluated Azure Data Factory (ADF), Azure Logic Apps, and Azure Function Apps, and selected ADF as the most suitable option for this functionality.
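As a sketch of how such a pipeline can be driven programmatically, the snippet below triggers an ADF pipeline run with the Azure SDK for Python (azure-mgmt-datafactory). All resource names, the pipeline name, and its parameters are placeholders rather than the customer's actual configuration.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Placeholder subscription; substitute your own.
    adf_client = DataFactoryManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",
    )

    # Kick off a run of a hypothetical pipeline that pulls security scores
    # from the upstream source and lands them in storage.
    run = adf_client.pipelines.create_run(
        resource_group_name="<resource-group>",
        factory_name="<data-factory>",
        pipeline_name="pl_pull_security_scores",
        parameters={"windowStart": "2022-11-03", "windowEnd": "2022-11-04"},
    )
    print(f"Started pipeline run: {run.run_id}")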

For data storage and processing…

We identified that Azure Data Explorer (ADX) provided the required degree of near real-time reporting, in contrast to the other options we evaluated. With this knowledge, we designed a reliable, secure, and cost-efficient solution using ADX for both storage and processing.
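For illustration, the sketch below uses the azure-kusto-data package to run a KQL summary over an assumed SecurityScores table in ADX, computing the three partner metrics in one pass. The cluster, database, table, and column names are hypothetical.

    from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
    from azure.kusto.data.helpers import dataframe_from_result_table

    # Hypothetical cluster URI; authentication here reuses the Azure CLI login.
    kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
        "https://<cluster>.<region>.kusto.windows.net"
    )
    client = KustoClient(kcsb)

    # Summarize the three metrics per partner over an assumed table.
    query = """
    SecurityScores
    | summarize
        CurrentAvgSecurityScore  = avgif(Score, Timestamp > ago(30d)),
        AvgSecurityScore         = avg(Score),
        MaxRecordedSecurityScore = max(Score)
      by PartnerId
    """

    response = client.execute("<database>", query)
    df = dataframe_from_result_table(response.primary_results[0])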

For maximizing performance and minimizing issues…

To maintain a queue of requests without introducing performance issues, we used Azure Event Grid. The pipelines were implemented to handle both scheduled batch loads and just-in-time (JIT) requests: scheduled batch loads pull data for a defined timeframe, while JIT requests pull data as soon as the request is received.
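A minimal sketch of that dispatch logic, assuming hypothetical event types and payload fields, might look like this: the subscriber inspects each Event Grid event and either pulls a defined timeframe (batch) or pulls immediately (JIT).

    # Hypothetical event types for the two request styles.
    BATCH_EVENT = "CloudProvider.SecurityScores.BatchLoad"
    JIT_EVENT = "CloudProvider.SecurityScores.JitRequest"

    def pull_scores(**kwargs) -> None:
        # Stand-in for the actual ADF/ADX pull described above.
        print(f"Pulling scores with {kwargs}")

    def handle_event(event: dict) -> None:
        data = event.get("data", {})
        if event.get("eventType") == BATCH_EVENT:
            # Scheduled batch load: pull data for the defined timeframe.
            pull_scores(start=data["windowStart"], end=data["windowEnd"])
        elif event.get("eventType") == JIT_EVENT:
            # Just-in-time request: pull as soon as the request is received.
            pull_scores(partner_id=data["partnerId"])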

For all-around security…

To secure data transmission, calls to Event Grid were authenticated with secret access keys. All transmissions to and from the Power BI service were encrypted.
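As a sketch of a key-authenticated call into Event Grid, the snippet below publishes a hypothetical JIT request event with the azure-eventgrid SDK, using the topic's access key as the credential; the endpoint and key shown are placeholders, and the call itself travels over HTTPS.

    from azure.core.credentials import AzureKeyCredential
    from azure.eventgrid import EventGridPublisherClient, EventGridEvent

    # Placeholder topic endpoint and access key.
    client = EventGridPublisherClient(
        "https://<topic-name>.<region>-1.eventgrid.azure.net/api/events",
        AzureKeyCredential("<topic-access-key>"),
    )

    # Publish a hypothetical JIT request event for one partner.
    client.send(
        EventGridEvent(
            subject="partners/<partner-id>/security-scores",
            event_type="CloudProvider.SecurityScores.JitRequest",
            data={"partnerId": "<partner-id>"},
            data_version="1.0",
        )
    )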

And for everything else?

Power BI Embedded was used to publish reports to the web application, while Row-Level Security (RLS) was implemented to restrict the data each partner can see.
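For illustration, the sketch below requests an RLS-scoped embed token from the Power BI REST API so the embedded report exposes only one partner's rows. The workspace, report, and dataset IDs and the "Partner" role name are placeholders, and acquiring the Azure AD access token (e.g., via MSAL) is assumed to have happened already.

    import requests

    # Placeholders: an Azure AD token for the Power BI REST API, plus the
    # IDs of the workspace, report, and dataset being embedded.
    access_token = "<aad-access-token>"
    workspace_id = "<workspace-id>"
    report_id = "<report-id>"

    # Ask for an embed token scoped by a hypothetical "Partner" RLS role.
    body = {
        "accessLevel": "View",
        "identities": [
            {
                "username": "<partner-id>",
                "roles": ["Partner"],
                "datasets": ["<dataset-id>"],
            }
        ],
    }

    resp = requests.post(
        f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
        f"/reports/{report_id}/GenerateToken",
        headers={"Authorization": f"Bearer {access_token}"},
        json=body,
    )
    embed_token = resp.json()["token"]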


Figure 2: Outcomes
Figure 3: Key Technologies