Tuesday, May 5, 2020

Make Power BI workspace governance easy

Power BI is one of the most trusted enterprise BI reporting platforms. It offers organizations powerful analytics and reporting features with a minimal learning curve.

Intended Audience:

Enterprise customers of Power BI with high adoption and usage rates need ways to simplify workspace governance. This blog presents an easy-to-use automated solution.

Business Scenario:

Most of our Fortune 500 customers use Power BI for analytics. The increased use of O365 groups makes workspace management difficult. With every Teams group creating its own Power BI workspace, managing these workspaces manually becomes tedious.

Workspaces establish collaborative environments where colleagues can create collections of dashboards and paginated reports. These collections can be bundled together into an app and distributed across the whole organization, or to specific people and groups.

To identify empty workspaces, report administrators had to navigate to the Power BI portal and manually check if a workspace was empty.

Technical Implementation:

We developed a web app that uses Power BI REST API calls to fetch workspace metadata, including information about the reports and dashboards each workspace contains. Once the prerequisites are met and access is granted, the app presents a list of workspaces accessible to users in the tenant, based on user role. It uses the Power BI REST APIs to determine whether workspaces contain content (reports, datasets, dashboards) and displays the results on a web page. A delete button attached to each workspace lets users delete it from this centralized page.

Figure 1: Centralized page displaying workspace content information
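The post does not include the app's source inline, but the core API calls can be sketched. The following minimal Python sketch (not the production app) assumes the standard Power BI REST endpoints (`GET /v1.0/myorg/groups/{id}/reports`, `DELETE /v1.0/myorg/groups/{id}`) and a pre-acquired Azure AD bearer token; the helper names are our own:

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def _get(token, path):
    """Call a Power BI REST endpoint and return its 'value' list."""
    req = urllib.request.Request(
        f"{API}/{path}", headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])

def workspace_contents(token, workspace_id):
    """Fetch the reports, datasets, and dashboards of one workspace."""
    return {kind: _get(token, f"groups/{workspace_id}/{kind}")
            for kind in ("reports", "datasets", "dashboards")}

def is_empty(contents):
    """A workspace is empty when it has no reports, datasets, or dashboards."""
    return not any(contents.values())

def delete_workspace(token, workspace_id):
    """Delete a workspace (requires admin rights on that workspace)."""
    req = urllib.request.Request(
        f"{API}/groups/{workspace_id}", method="DELETE",
        headers={"Authorization": f"Bearer {token}"})
    urllib.request.urlopen(req)
```

Because deletion requires admin rights on the workspace, the centralized page gates the delete button behind the admin role.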

The Power BI REST APIs internally handle the conversion of groups from classic to V2 workspaces during deletion. Because this change is irreversible, we added a warning message that prompts users before they delete empty workspaces.

Access Instructions:

User Access

To access the application, users must sign in using their O365 accounts. To view the workspace details, the user only needs viewer access. However, in order to delete workspaces, the user also needs admin access.

App Permissions

To run the application, the user must grant consent on Workspace.ReadWrite.All and Report.ReadWrite.All permissions. These are required by the AAD app to generate the access token, which is used by the remaining API calls to successfully retrieve data. This enables the application to obtain the list of accessible workspaces. To authorize the application to call APIs that delete workspaces, users/admins need to provide Admin Consent. The list of configured permissions should include all permissions needed by the application, as shown below.

Figure 2: List of required permissions
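For illustration, the token exchange behind these permissions can be sketched as an OAuth 2.0 authorization-code request against the Azure AD v2.0 token endpoint. The tenant and client IDs below are placeholders and the function name is our own; production code would typically use a library such as MSAL instead:

```python
from urllib.parse import urlencode

TENANT = "your-tenant-id"       # placeholder values
CLIENT_ID = "your-aad-app-id"
SCOPES = [
    "https://analysis.windows.net/powerbi/api/Workspace.ReadWrite.All",
    "https://analysis.windows.net/powerbi/api/Report.ReadWrite.All",
]

def token_request(auth_code, redirect_uri):
    """Build the POST target and body for the AAD v2.0 token endpoint."""
    url = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token"
    body = {
        "client_id": CLIENT_ID,
        "grant_type": "authorization_code",
        "code": auth_code,
        "redirect_uri": redirect_uri,
        "scope": " ".join(SCOPES),
    }
    return url, urlencode(body)
```

The access token returned by this exchange is what the subsequent Power BI API calls carry in their Authorization header.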

Business Outcomes:

An automated approach:

  Reduces the effort in identifying and deleting empty workspaces 
  Enables role-based admin privileges to prevent misuse and unauthorized usage
  Enables workspace tracking in the organization 

Using the web-based tool, administrators can now clear out unused workspaces at the click of a button. By providing tenant-based role permissions, we ensure security and prevent unauthorized usage. Thus, implementation of the solution has helped reduce needless efforts, increasing overall productivity within the organization.

How to get the Solution:

Click here to access the codebase.


Feel free to fork the solution and contribute to it. If you come across any issues, please raise them in the issues section.

Monday, May 4, 2020

Page Load Time: Tracking the Performance of Power BI Reports

Power BI has become one of the preferred data visualization options across organizations. Many companies rely on the powerful analytics and reporting features offered by Power BI.

Business Scenario:

Most of our Fortune 500 customers use Power BI as their default reporting and analytics solution. It is important to ensure a great experience for users, who rely on these reports to drive their business decisions. One of the KPIs we track to ensure great user experience is the report load time.

Unfortunately, measuring load time is complicated by a variety of factors, including data volume, network bandwidth, and other system parameters. We needed an automated solution that could capture the page load time of Power BI reports with high accuracy, while also tracking performance improvements made by the development teams.

Existing Process:

Page Load Time (PLT) is commonly measured either manually through web page refresh or via the Performance Analyzer tool. A manual calculation is not only inaccurate, but also highly inefficient. The Performance Analyzer tool, on the other hand, is available only in Power BI Desktop, which limits its usefulness for gauging report performance on the Power BI service.

In addition, most of the custom solutions available today focus on measuring the whole page’s load time. This includes the loading of non-essential components of the Power BI service that are not critical to the actual load time of the report.

At present, there is no easy way to gauge the impact of incremental optimizations made to reports, nor to track improvements over time.

Key Business Challenges:

  Manual effort – Reduce the effort required to manually capture the page load time of reports. 
  Accuracy – Examine only the important components of Power BI when calculating the load time. 
  Report tracking – Track the impact of modifications made to Power BI reports over time. 
  Benchmarking – Benchmark acceptable load time and track against that number. 
  Efficient tracking – Track page load time at a page level. 

What is the PLT in our context?

Our definition of PLT encompasses the end user experience when they load a Power BI report from their browser. This is marked by the total time taken between the first request sent by a browser and the last object rendered on the page.
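Concretely, given timing entries like those exposed by the browser's Resource Timing API (the field names below follow that API; the entries themselves are made up), PLT reduces to a simple span calculation:

```python
def page_load_time(entries):
    """PLT: span from the first request start to the last response end,
    over timing entries with startTime/responseEnd in milliseconds."""
    if not entries:
        return 0.0
    first_request = min(e["startTime"] for e in entries)
    last_render = max(e["responseEnd"] for e in entries)
    return last_render - first_request

# Illustrative entries for one report page load.
entries = [
    {"name": "report.html", "startTime": 0.0,  "responseEnd": 120.0},
    {"name": "visual.js",   "startTime": 40.0, "responseEnd": 310.5},
    {"name": "data.json",   "startTime": 95.0, "responseEnd": 980.0},
]
print(page_load_time(entries))  # 980.0 ms
```

Filtering the entries to only report-critical resources before taking the span is what separates this measurement from whole-page load time.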


Our Solution:

We designed a browser extension that takes the report URL(s) and Google Chrome profile from users and delivers PLT results via a Power BI report.

Solution Design:

For developers, the browser extension suffices to check current report load time. It provides an easy-to-use input form, which users can fill in with the necessary details and execute in the browser.

In order to track and perform trend analysis, we extended the solution to include an Azure database that stores PLT information. This includes elements like PLT capture, timestamps, and page URL(s). Users can configure information about what report URLs to track using an XML file. 
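The exact schema of that XML file is not shown in the post; a plausible shape and the corresponding parsing step might look like this, with the element and attribute names being our own assumptions:

```python
import xml.etree.ElementTree as ET

# Assumed config shape; the tool's actual XML schema may differ.
CONFIG = """
<reports>
  <report url="https://app.powerbi.com/groups/g1/reports/r1" benchmarkMs="3000"/>
  <report url="https://app.powerbi.com/groups/g1/reports/r2" benchmarkMs="5000"/>
</reports>
"""

def tracked_reports(xml_text):
    """Parse the XML config into (report URL, benchmark in ms) pairs."""
    root = ET.fromstring(xml_text)
    return [(r.get("url"), int(r.get("benchmarkMs")))
            for r in root.findall("report")]

print(tracked_reports(CONFIG))
```

Each captured PLT can then be written to the Azure database alongside its URL and timestamp for trend analysis.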

For those intending to institutionalize the load metric, we extended the solution to use the Power BI REST API to track any new pages or reports added to a workspace on the Power BI service. The solution both tracks reports published to the workspace and measures their load time. This data can serve as one of the acceptance criteria before moving from a UAT/test workspace to the production workspace.

Data Representation:

Once data is captured and added to the back end, it can be represented in a Power BI report. This method can highlight reports and/or pages with the highest PLT, and those with the lowest. We have listed some data representation options below that might appeal to organizations that want to track PLT centrally. 

By setting a benchmark at the report level, you can show variance around that benchmark. Trend representations highlight the top gainers and losers in terms of load time.

You can also set up emails of report thumbnails that are sent to specific recipients. These contain the metric information, the report URL, and indicators of whether the report loaded correctly.

Figure 1: PLT Compliance Report

Our Power BI compliance report allows leads to track the performance of their reports over the course of two weeks.

Figure 2: Sample PLT tool report

Figure 3: Sample PLT Trends

Business Outcomes:

  Better visualization of report performance. 
  Better understanding of the impact that report modifications have on performance over time. 

Through this solution, users can quantifiably track report performance. It provides data validation and tracking services. In addition to improving the performance of reports, our solution also improves overall user experience.

Monday, April 27, 2020

Optimizing the Release Management Workflow with PowerApps and Teams

Business Case:

A typical IT project delivery process comprises multiple phases, from planning to deployment. Together, these phases achieve the objective of controlled project delivery. However, when multiple independent teams work under the same portfolio while handling different streams and phases, project delivery can get complicated.

In these scenarios, interdependency between the development and deployment teams necessitates numerous request and approval processes. Channeled through multiple tools and software, these increase the complexity in the overall process, and the time to delivery.


Our Solution:

We brought together the capabilities of Power Platform and Microsoft Teams to build an interface that replaced multiple touchpoints between the development and deployment teams, thereby:

1.    Improving time to delivery
2.    Improving visibility of the deployment pipeline
3.    Reducing process complexity

Solution Design:

Current System:

With multiple teams in play, communication is key to initiate and track various activities such as code review, blocking slots for deployment, and build and release approvals. Communication takes place through a variety of platforms, including Outlook, Microsoft Teams, Azure DevOps, SharePoint, and Microsoft Forms.

A typical deployment activity involves the development team blocking a slot with the deployment team in the SharePoint calendar, and initiating code review in Azure DevOps. On deployment day, the development team communicates with the approvers for the build and release through Outlook or Microsoft Teams. The release team members then trigger the release pipelines through Azure DevOps.

Figure 1: Current release management process

New Single Platform Release Management System:

We built a unified interface, a Microsoft Teams channel integrated with PowerApps, to consolidate the dialog required between the development teams, the deployment team, and the approvers. Development teams can now schedule their deployments with a single click using a deployment scheduler integrated into Microsoft Teams. The scheduler automatically updates the SharePoint calendar and blocks the time with the deployment team. We used Power Platform to route the Azure DevOps code review, build, and release notifications to Microsoft Teams, where each notification appears in a channel as a new conversation. Users can take the appropriate approval actions and trigger pipelines from within the Microsoft Teams interface.

As a result, all deployment activities are now seamlessly integrated into the Microsoft Teams channel. This eliminates the need for multiple touchpoints to accomplish a successful deployment.

Figure 2: New single platform release management

Business Outcome:

The unified interface reduced the time and complexity of release processes, while improving visibility across all the dependent teams. By limiting the touchpoints to a single interface, we achieved greater efficiency in release management and ensured better quality control.


1.    Configured PowerApps-backed application to handle release processes
2.  Streamlined the deployment process with fewer touchpoints and greater visibility across the pipeline

Friday, April 24, 2020

Optical Character Recognition - Event Attendance and Feedback Management

Business Case:

Our customer conducts an annual internal expo with numerous showcase events. Thousands of employees attend the events. It was a very time-consuming process to manually track attendance, send acknowledgement messages, get feedback, and compile insights from all the interactions.


Our Solution:

We created a PowerApps-based solution that used Microsoft Cognitive Services for NLP and image processing, Logic Apps for automation, Microsoft Teams for post-event discussions, and Power BI for reporting insights.

Solution Design:

We built a check-in app that uses the device’s built-in camera to capture the identification badge of each participant. The captured images are stored in Azure Blob Storage. An Azure logic app reads the content of the captured images using the Optical Character Recognition (OCR) API to confirm participation and update attendance records in the database. After the event, notifications are sent to attendees via Teams, using a Flow Bot, asking them to fill out a feedback survey. The Feedback App captures and reports the survey responses to determine the CSAT (Customer Satisfaction) score of the event.
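As a sketch of the matching step: the JSON returned by the classic Computer Vision OCR endpoint nests text as regions → lines → words. The shape below follows that format, while the badge IDs and helper names are invented for illustration:

```python
def extract_text(ocr_result):
    """Flatten the OCR API's regions/lines/words JSON into a word list
    (response shape assumed from the classic Computer Vision OCR API)."""
    words = []
    for region in ocr_result.get("regions", []):
        for line in region.get("lines", []):
            for word in line.get("words", []):
                words.append(word["text"])
    return words

def match_attendee(ocr_result, attendees):
    """Mark attendance when a registered badge ID appears on the scan."""
    scanned = set(extract_text(ocr_result))
    return [a for a in attendees if a["badge_id"] in scanned]

sample = {"regions": [{"lines": [{"words": [
    {"text": "EMP-1042"}, {"text": "Jane"}, {"text": "Doe"}]}]}]}
attendees = [{"badge_id": "EMP-1042", "name": "Jane Doe"},
             {"badge_id": "EMP-2077", "name": "Alex Kim"}]
print(match_attendee(sample, attendees))
```

Scans that fail to match any registered badge are the ones routed to the admin app for manual mapping.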

There are multiple checkpoints incorporated in the system to ensure authentication. All attendees’ Teams and mailing details are securely stored in a database for the period of the event. While authentication occurred through Azure AD, experience owners could use their mobile phones to access information through PowerApps.

Figure 1: Solution architecture

Application Preview:

The app provides two separate interfaces: one for experience owners (employees) and one for event organizers (admins).

  The check-in app provides experience owners with a list of planned events and allows them to mark attendance. The feedback tab allows them to provide feedback for the events based on experience area and showcase details. 
  The admin app provides control over the events being organized. Users can manage attendance and make necessary alterations, such as manually mapping unidentified or unreadable badge scans. 

Figure 2: Application screenshots

Business Outcome:

The app increased the ease of capturing event attendance. Our client used session attendance information to evaluate session adequacy and effectiveness. In addition, in-depth analysis of end-user feedback helped the client identify focus areas to improve user experience.


1.    Developed a low-code, enterprise-grade solution with minimal time to market
2.    Optimized the effort spent on attendance tracking and feedback management for events
3.    Enabled admins and organizers to understand users and improve the relevance and impact of the program
4.    Reduced manual error
5.    Improved the efficiency of recording attendance and performing feedback assessment

Monday, January 13, 2020

Machine Learning Forecasts Customer Sales

Business Case:

Our client runs a worldwide chain of retail stores and generates billions of dollars of revenue annually.

Our client needed to understand the impact of weather, promotions, discounts, product launches, holidays, and other events on sales. The client’s existing predictive sales model routinely underestimated sales volume at both the aggregated and daily level. Our client also needed to better understand the causes of seasonal and daily spikes in sales. For example, on December 25, 2018, actual sales in the US were 25% greater than predicted sales.

Existing Process:

Prior to working with MAQ Software, our client’s marketing and finance team used a statistical model that leveraged past sales and promotion data to forecast sales. The statistical model’s results were unreliable and inaccurate. With the previous system, sharing timely and accurate information with the leadership team was difficult.

Key Business Challenges:

   Accuracy - Build a machine learning model to improve the accuracy of future sales predictions. 
   Reliability – Build a reliable forecast model to facilitate marketing, supply chain, and inventory decisions. 
   Simplicity – Build an automated framework that predicts future sales and retrains itself once actual sales are injected into the model.

Key Technical Challenges:

   Build a machine learning model that regularizes variables using the smoothness of predictor functions. 
   Build a machine learning model that analyzes and uncovers patterns in data for nonlinear fitting and predicts future sales using historical data points as inputs. 
   Examine the correlation between weather data (precipitation, temperature, pressure, wind speed, cloudiness, and so on) and sales at a specific longitude and latitude.
   Analyze the impact of factors such as product launches, promotions, discounts, and holidays on predicted sales.
   Include seasonality variables to explain seasonal fluctuations in the sales time series.
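To make the weather challenge concrete: correlating a weather variable with sales at a given longitude and latitude reduces to a Pearson correlation per location. A minimal sketch with made-up daily data (not the client's figures):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative data only: daily temperature (°F) vs. store sales (units).
temperature = [31, 45, 52, 60, 71, 77, 83]
sales       = [120, 135, 160, 155, 190, 205, 230]
print(round(pearson(temperature, sales), 3))  # strong positive for this toy data
```

Repeating this per weather feature (precipitation, pressure, wind speed, cloudiness) identifies which variables are worth feeding into the model at each store location.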


Our Solution:

We worked with the client’s marketing operations and finance team to collect and analyze their sales data, promotion and discount data, and store events data. We also used NOAA historical weather data from the US government to develop the historical weather model. We extrapolated the historical data and used APIs to connect the data to our machine learning model to predict future weather.

Our team created a hybrid ML model that predicts future sales far more accurately than the previous ML model. The prediction ran on an aggregated and daily basis, and the model retrains itself once actual sales figures are injected into the model.

Our model’s MAPE (Mean Absolute Percentage Error) value was 0.09, compared to the previous model’s MAPE value of 0.13. (A lower value indicates greater accuracy.)
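For reference, MAPE is the mean of the absolute percentage errors between actual and predicted values. This sketch (toy numbers, not the client's data) shows the computation:

```python
def mape(actual, predicted):
    """Mean Absolute Percentage Error, expressed as a fraction."""
    return sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

actual    = [100, 200, 150, 400]
predicted = [ 90, 210, 165, 380]
print(round(mape(actual, predicted), 3))  # 0.075
```

A MAPE of 0.09 thus means predictions were off by 9% of actual sales on average, versus 13% for the previous model.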

Key Highlights:

   Used R libraries and custom functions to cleanse and preprocess the data. 
   Used descriptive statistical analysis to tackle skewness and kurtosis for the features. 
   Incorporated weather data to train the model and analyze the impact of weather on sales.
   Performed Fourier transforms to decompose sales, analyze trends, and remove noise from the sales time series.
   Forecasted sales depending on weather variations for the client’s store at a specific longitude and latitude.
   Applied logarithmic, exponential, and S-curve transformations to features to introduce nonlinearity as per real life scenarios.
   Developed hybrid regression models to predict future sales using nonlinear, multiplicative, probabilistic, regularized, and deep learning approaches.
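The Fourier-transform step above can be sketched as a low-pass filter: transform the sales series, keep only the strongest frequency components (trend plus seasonality), and invert. The series below is synthetic, and `keep=3` is an illustrative choice, not the production setting:

```python
import numpy as np

def fft_denoise(sales, keep=3):
    """Keep only the `keep` strongest frequency components of a series
    and reconstruct it via the inverse FFT (removes high-frequency noise)."""
    spectrum = np.fft.rfft(sales)
    order = np.argsort(np.abs(spectrum))[::-1]
    mask = np.zeros_like(spectrum)
    mask[order[:keep]] = spectrum[order[:keep]]
    return np.fft.irfft(mask, n=len(sales))

rng = np.random.default_rng(0)
t = np.arange(104)                                # two years of weekly sales
seasonal = 100 + 20 * np.sin(2 * np.pi * t / 52)  # yearly cycle
noisy = seasonal + rng.normal(0, 5, size=t.size)
smooth = fft_denoise(noisy, keep=3)
# The filtered series sits closer to the seasonal signal than the noisy one.
```

The retained coefficients (the mean level and the seasonal frequency) are exactly the components the seasonality variables in the model are meant to explain.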

Our Sales Forecasting Engine, built on Microsoft Azure Databricks, allowed our client to align their business objectives with predicted sales. Figure 1 shows the architecture of our Forecasting Engine.

Figure 1: Architecture of Forecasting Engine

Business Outcomes:

Our supervised machine learning predictive model empowered our client to analyze the impact of weather, promotions, discounts, product launches, holidays, and daily events on sales and execute business decisions accordingly. The model also identified the delay between an event and the seasonal spike, which empowered our client to maximize sales following an event. 

Outcome Highlights:

   Forecasted sales depending on weather variations for the client’s store at a specific longitude and latitude 
   Analyzed the positive and negative impacts of daily events such as discounts, promotions, launch events, and holidays on predicted and actual sales.  
   Identified and explained seasonal spikes in sales time series statistically.
   Identified the lag period for daily events to explain the behavior in time series.

Thursday, January 2, 2020

Supervised Machine Learning Model Forecasts Impact of Co-Sell Deals on Sales Revenue Using Azure Databricks

Business executives require visibility into the impact of decisions at all levels of their companies. Executives appreciate having this visibility at each stage of a project, from initiation to closing. Reports showcasing the results of business decisions across the entire organization help provide such visibility. This case study reviews one level of a management decision chain, where revenues are based on sales managers’ offers to customers.

Business Scenario:
Our client, a team of sales managers at a multinational software organization, sells software product solutions to retail customers via dealers (co-sellers). Our client needed to quantify the impact of successful deals on the revenue driven by end retail customers. Our client also wanted to know how much their collaborative business model with dealers drove revenue. With this information, the client could focus on the factors with a higher impact on sales.

To provide our client with insights on co-sell deals, we created a Power BI dashboard powered by a machine learning model.

How We Did It:


Data Gathering: We realized that we needed additional data points to generate a successful and accurate model. We integrated different data sources to ensure access to important key features, such as customer geography and the status of deals between clients and dealers.

Data Consolidation: We performed a deep dive into the customer data set to find basic insights and possible key features that might impact revenue. We collected data from various systems, such as CRM and sales systems, which we used in subsequent steps for data engineering.

Feature Engineering: After gathering the required inputs, we feature engineered our input dataset and used Databricks to create a forecasting model using linear regression. We used existing and custom regression techniques to improve the forecasting model’s accuracy.

Optimization: We cut the model run time from 3 hours to 15 minutes by parallelizing the forecasting algorithm for each customer. This optimization provided the customer with near real-time analytics information.
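Because each customer's forecast is independent, the parallelization amounts to mapping a per-customer fit across workers. The sketch below substitutes a naive linear extrapolation for the real Spark R regression, purely to show the pattern; in production the same fan-out ran on a Databricks cluster rather than local threads:

```python
from concurrent.futures import ThreadPoolExecutor

def forecast(customer):
    """Stand-in for the per-customer model fit (the real pipeline ran
    Spark R regressions here); extrapolates one step linearly."""
    history = customer["sales"]
    growth = (history[-1] - history[0]) / max(len(history) - 1, 1)
    return customer["id"], history[-1] + growth

def forecast_all(customers, workers=4):
    """Fan the independent per-customer forecasts out across workers."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(forecast, customers))

customers = [{"id": f"c{i}", "sales": [100 + i, 110 + i, 120 + i]}
             for i in range(8)]
print(forecast_all(customers))
```

Since no customer's result depends on another's, adding workers (or cluster nodes) cuts wall-clock time almost linearly, which is what took the run from hours to minutes.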

Key Challenges:
The major challenge we faced was forecasting revenue using existing data sources.

Another key challenge involved the model’s run time. The model was based on existing forecasting libraries in the ‘R’ environment, which were not as fast as we wanted. We migrated the ‘R’ libraries to ‘Spark R’, enabling the model to run and provide forecasts quickly. We also had to improve scalability, considering future scenarios in which our model would run not just on thousands, but millions of records. We achieved scalability using Azure Databricks, thereby providing a complete cloud-based, near real-time analytics solution.

Business Outcomes:
The end result was a Power BI report that helped our client understand the lifetime value and growth trend of co-sell wins. The report also provided the ability to slice and dice the dataset with built-in filters for further insights.

Our dashboard allowed our customer to visualize impact by fiscal year, customer details (such as area, region, subregion, subsidiary, segment, and subsegment), and performance.

Our supervised machine learning model resulted in five benefits for our client:
1.    Optimal cost through an end-to-end model in the cloud with smart infrastructure
a.    Automated SKU scalability with accelerated ML model runs.
2.    Quicker insights from accelerated model runs
a.    Model run time reduced from 3 hours to 15 minutes with parallel data processing.
b.    Option for scalability of input data, leading to future support for growing data volumes.
3.    Near real-time impact analysis for input data
a.    We achieved near real-time modeling and forecasting.
4.    Ability to develop marketing strategy based on:
a.    Geography of end customer.
b.    Strategic relevance of partner for business.
c.     Lifetime and time-based impact of deal on sales revenue.
5.    Enhanced security through role-based access control, so that current and forecasted revenue data is safe from unauthorized access.

Wednesday, November 13, 2019

Engineering Insights: WPF Application Optimizes and Eases Delivery

Business Scenario

As a part of support services, our client provides customers with troubleshooting services and the ability to proactively check the health of software installed on their system.

To use the platform, users traditionally had to perform the following steps:

1. Download and install software drivers on one or more data collection machines.
2. Run PowerShell commands to set up the software.
3. Run PowerShell commands with over ten parameters to configure the health check.
4. Run PowerShell commands to start the health check.
5. Run PowerShell commands to stop the health check.

Even after completing the steps, users did not know whether the health check successfully completed. The actual execution of the health check required several hours.

Key Challenges:

Our client faced three key challenges:

1. Downloading and installing the drivers and scripts and configuring and running the health check required significant time. As a result, many users abandoned the health check and did not complete the process.
2. The client's engineering team often needed to support their customers through the setup, configuration, and execution steps. As a result, the engineering team could not always focus on building new features and improving the platform. 
3. Users frequently created support tickets after following incorrect steps for the setup. The high number of support tickets led to significant support costs. 


Our client wanted to help customers by providing an automated method to configure the health check software. They needed an application to guide customers through the prerequisites, setup, configuration, and execution steps.

To address our client’s challenges, we needed to understand our client’s customers’ obstacles. We shadowed the engineering team and identified five common challenges.

1. Customer machines often did not possess the prerequisites necessary for running certain health checks. 
2. Customer machines sometimes contained policies that interfered with health check execution.
3. Customers often did not provide the parameters required for health checks for specific products.
4. Customers often entered credentials in PowerShell commands incorrectly.
5. Customers struggled to know whether they successfully completed the health check.

Armed with this knowledge, we developed an easy-to-use Windows Presentation Foundation (WPF) application that considerably simplified the customer experience and decreased the effort required of customers. The WPF application resulted in a seamless end-user experience. 

Key Highlights:

1. Our application provides a one-click installation solution, removing the need for multiple PowerShell steps.
2. Our application provides clearly named menu options and buttons for users to evaluate and install all prerequisites and necessary drivers. Our application also allows users to enable and disable relevant policies and other settings. 
3. Our application allows users to specify all parameters, providing a customized install experience.
4. To simplify the health check, we separated the setup process into stages and visualized the stages with various colors to signify success, failure, or progress.
5. Our application validates all user inputs, identifies mistakes and invalid values, and assists users with troubleshooting using tooltips.
6. After users complete the health check, our application performs several tests and provides a confirmation of successful setup. If the setup is unsuccessful, the application provides a list of issues encountered, along with documentation for correction.
7. Our application also provides users with the ability to set up multiple health checks.

Business Outcomes:

The application we developed is available for download on our client’s website. Our client received many accolades for the application from their engineering teams, support teams, and customer-facing community. The application resulted in increased goodwill and efficiency for both our client and their customers.

1. Over ten percent of customers downloaded and used the application within a week of launch.
2. Application downloads and usage continue to increase.
3. Platform usage increased, and the completion percentage of health checks improved.
4. The number of support requests decreased.