Wednesday, July 22, 2020

Promoting Business Continuity Through Remote Operations – Assignment Tracking and Bonus Management

Business Case:

COVID-19 has affected businesses across the board, with governments around the world imposing strict restrictions and advisories. Our client decided to close physical stores in response to COVID-19. As part of their Business Continuity Plan, store associates began working on their assignments remotely. Associates could pick from five strategic priorities that align with overarching company goals. As a result, our client faced problems consolidating this work and generating insights from it.


We created a Power Apps solution that allows our client to easily assign work to associates. The solution also lets associates opt in/out of the remote work program and tracks their eligibility towards bonuses they could earn as part of it.

Solution Design:

Our Power App offers role-based security primarily configured for Store Managers and Store Associates. Store Associates input their preferences while Store Managers assign tasks and bonuses. This information is stored in an Azure SQL database (DB) with the help of Microsoft Power Automate flows. The app also sends notification emails to Store Associates and the general store alias account. We built a Power BI report on top of the Azure SQL DB to meet client reporting needs and perform further analysis on the effectiveness of the program.
Figure 1: Solution Design
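For illustration, the opt-in/opt-out transactions the Power Automate flows write to Azure SQL might take the shape sketched below. This is a minimal sketch: the `OptInTransactions` table and its column names are hypothetical placeholders, not the client's actual schema.

```python
from datetime import datetime, timezone

# Hypothetical table: OptInTransactions(AssociateId, StoreId, OptedIn, RecordedAtUtc)
def build_opt_in_record(associate_id, store_id, opted_in):
    """Shape of one opt-in/opt-out transaction written to Azure SQL."""
    return {
        "AssociateId": associate_id,
        "StoreId": store_id,
        "OptedIn": opted_in,
        "RecordedAtUtc": datetime.now(timezone.utc).isoformat(),
    }

def insert_record(conn, record):
    """Insert one transaction row; `conn` is a pyodbc connection to the Azure SQL DB."""
    conn.cursor().execute(
        "INSERT INTO OptInTransactions (AssociateId, StoreId, OptedIn, RecordedAtUtc) "
        "VALUES (?, ?, ?, ?)",
        record["AssociateId"], record["StoreId"],
        int(record["OptedIn"]), record["RecordedAtUtc"],
    )
    conn.commit()
```

In the actual solution the write happens inside a Power Automate flow rather than custom code; the sketch only illustrates the record shape behind the roughly 40K transactions described below.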

Application Preview:

The app provides separate interface experiences:

  Store Manager interface: Provides a list of store associates who have opted in to remote operations, and is used to generate assignments and bonuses. It also lets managers view activity history and update the opt-in/opt-out data of their store associates. 

Figure 2: Store Manager Profile

  Store Associate interface: Provides the option to opt in or out of remote capabilities. 

Figure 3: Store Associate Profile (Opted Out)

Figure 4: Store Associate Profile (Opted In)

Business Outcome:

Our solution made creating and tracking remote assignments easier, streamlining workflow management even during COVID-19 interruptions. Over 2.5K employees have been onboarded onto the app, which has processed around 40K opt-in/opt-out transactions in the span of two months. This has enabled our client to maintain business as usual while monitoring the program through a strategic lens to minimize losses.


1.    Developed a low-code, enterprise-grade solution with minimal time to market
2.    Optimized task assignment and bonus assessment during remote operations
3.    Enabled business continuity despite closure of stores due to COVID-19

Monday, July 13, 2020

QlikView to Power BI Migration

The steps we followed to carry out our migration strategy are detailed in this blog.

1. Requirements Gathering and Analysis

We needed to understand our client's current state of affairs. During the Requirement Gathering and Analysis phase, we evaluated the client's existing reporting platform. We examined the QlikView reports, their usage, UI/UX, audience, data, and security implementation.

Through this we defined the scope and acceptance criteria, and created a report and data estate inventory. Altogether we had 16 reports to migrate. The data source consisted of 20+ SQL databases and 30 Excel sources.

2. Planning and Design

We performed a detailed gap analysis. This identified the different features, and visualization and modelling variations in Power BI as compared to QlikView. We highlighted alternatives to the challenges identified, including:

1.    QlikView provides a 'Clear Selections' feature that clears all the filters on a report. We used Power BI's 'Reset to Default' option to reset all filters to their defaults.
2.    QlikView supports dynamic variables in a report that can be expressed in visuals. We used 'What-If' parameters in Power BI, which serve the same purpose.
3.    QlikView has a feature that shows all the filters applied in a section. We used the new filter experience in Power BI, which shows all the filters applied to a visual.

Through this, we were able to identify a template and prepare a mock-up of reports for review. We separated the datasets based on the reports’ refresh frequency (daily/hourly). We broadly classified the reports as “Financial” and “Operational.” We then sub-categorized them based on their complexity: “simple,” “medium,” and “complex.” We proposed a solution architecture and an execution timeline to the client.
Figure 1: Solution Architecture
Figure 2: Execution Timeline

3. Execution

During the Execution phase, we implemented the approved solution architecture. We set up the environment for Power BI report development and followed the agile methodology for sprint execution, with the following steps:

1.    Sprint Plan – We set up weekly calls with the client, created a product backlog, and defined the scope and size of the sprint.
2.   Implementation – Using best practices, the proposed template, and theme, we began the report migration, providing incremental builds. We combined some of the simple reports with the same data source to form common data models.
3.    Performance Tuning – We combined the smaller datasets and optimized the data models to reduce data latency.
4.    Test – We used a set of in-house tools to perform automated testing that tracked query performance, and provided insights on visual layout and data validation.
5.    Deployment – We marked the closure of our sprints by deploying the reports and readying the build for UAT.

4. Deployment and Post-Production

This stage helped us ensure the end user was satisfied and comfortable with the new system. During the Deployment and Post-Production phase, we:

1.    Conducted user acceptance testing (UAT) through numerous user acceptance sessions.
2.    Got the sign-off for UAT and production.
3.    Conducted a premium capacity analysis to establish a suitable refresh schedule for dataflows.

To complete the transfer of ownership, we handed off the code, reports, and workspace inventory to the client. We also proposed our decommissioning strategy for the older reports.

5. Decommissioning

In the Decommissioning Phase, we obtained sign-off from stakeholders to systematically retire the old QlikView reports. We updated the end URLs of the reports and tracked the usage of the existing QlikView reports until it eventually dropped to nil.


  Standardized reporting mechanism across the business divisions. 
  Reduced redundant reporting through report consolidation. 
  Enhanced security through role-based access control using RLS. 
  Cost reduction by enabling EF to retire QlikView, eliminating the overhead of maintaining two BI tools. 

Tuesday, June 30, 2020

A Unified Interface that Monitors and Tracks Data Operations

Business Case:

IT project maintenance and continuance processes comprise multiple tasks that require regular monitoring. Monitoring and tracking are especially important when projects deal with real-time data.

A typical DataOps project might require tracking job statuses, database health, database access, server health, CPU usage, efforts, issues, and much more.

Monitoring becomes even more complicated when multiple development teams work under the same project, handling different streams and phases. Their data, resources, and deployment level interdependencies necessitate a high degree of collaboration. The effort required to manually track and monitor these details is exhausting, inefficient, and costly. Factors like refresh frequencies, requisite SLAs, and shifting stakeholder priorities complicate project structures.


We integrated our production infrastructure with the capabilities of Power BI and PowerApps to create a single view of all the parameters to:

1.    Simplify the monitoring and tracking interface
2.    Reduce manual intervention points
3.    Reduce process complexity
4.    Improve collaboration among dependent teams
5.    Fix recurrent issues and streamline data flow

Solution Design:

We used Microsoft Power BI to create reports using real-time data from SQL Server applications to track:

  Job running status for on-premises and cloud services (completed, in progress, failed, scheduled, disabled) 
  Job history 
  Database status (health, stale tables, job trends, data latency, database access) 
  Server status (health, memory usage, CPU usage) 

Figure 1: Solution Design
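To illustrate the job-status tracking described above, here is a minimal sketch that reads SQL Server Agent job outcomes from the standard `msdb` history tables. The `run_status` codes follow `msdb.dbo.sysjobhistory` conventions; the exact query our reports use, and how "scheduled" and "disabled" states are derived, may differ.

```python
# run_status codes from msdb.dbo.sysjobhistory
RUN_STATUS = {0: "failed", 1: "completed", 2: "retry", 3: "canceled", 4: "in progress"}

JOB_STATUS_SQL = """
SELECT j.name, h.run_status, h.run_date, h.run_duration
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
WHERE h.step_id = 0  -- job-outcome rows only, not individual steps
ORDER BY h.run_date DESC;
"""

def label_status(code):
    """Translate a numeric run_status code into a readable label."""
    return RUN_STATUS.get(code, "unknown")

def fetch_job_statuses(conn):
    """`conn` is a pyodbc connection to the monitored SQL Server instance."""
    rows = conn.cursor().execute(JOB_STATUS_SQL).fetchall()
    return [(row[0], label_status(row[1])) for row in rows]
```

The Power BI report then visualizes these labeled statuses alongside job history and trends.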

We created a PowerApps-backed application to register and track any issues and their status (open vs. resolved). Our solution lets users enter the cause of an issue and the action taken to resolve it. Users can also track the effort put in by team members to resolve said issues.

Business Outcome:

A single-point view reduced the time and complexity of tracking and monitoring 40+ streams, 250+ jobs, 150+ databases, and 70 servers in real time. It facilitated better collaboration across all the dependent teams.

Our solution enables us to retrospectively analyze, identify, and isolate the root causes of recurrent issues and process bottlenecks, and fix them with far less manual effort. By limiting manual touchpoints, we improved our efficiency by over 60%. Our solution also optimized resource utilization, lowered the chance of error, and increased quality control.


1.    Our unified interface improved overall efficiency by over 60%, with optimized resource utilization, lower error rates and better quality control.
2.    The single-point view reduced the time and complexity of tracking and monitoring jobs, databases, and servers.

Tuesday, June 16, 2020

Everything You Need to Know About Migrating to Power BI

Power BI Migration Strategy:

Business Intelligence (BI) helps corporate executives, business managers, and other operational workers make better and more informed business decisions. Organizations use BI to cut costs, identify new business opportunities, and spot inefficient business processes.

BI tools used to be very difficult to use. Trained professionals needed a deep well of knowledge and expertise to create reports and dashboards. Minor tweaks and enhancements required a fresh development cycle.

BI reporting tools like Microsoft’s Power BI have evolved past that. Now, business users with minimal training can build and tweak their own dashboards and reports, without assistance from the BI team. Power BI and similar tools leverage the storage, computing power, and availability afforded by cloud platforms. Organizations can now deploy quickly and adjust usage on the go, without spending huge amounts of time and money upfront.

In this article, we will detail our experience migrating customers to Power BI. Our processes help customers enable data-driven culture in a way that is cost effective, secure, and scalable. We will examine each of the six migration stages in detail:

1.    Requirements Gathering and Analysis
2.    Planning and Design
3.    Execution
4.    Deployment and Post-Production
5.    Center of Excellence
6.    Decommissioning

1. Requirements Gathering and Analysis

The initial stage will always focus on understanding the needs of our customers. During the Requirement Gathering and Analysis phase, we evaluate the existing reporting platform. We examine reports, usage, UI/UX, audience, data, and security implementation.

Through this, we understand scale and complexity. This helps us define scope and acceptance criteria, and allows us to create an inventory of reports and data estate.

Example: In our most recent data migration projects, we addressed the following four challenges:

1.    High page load time
2.    Difficulty managing and maintaining multiple data sources
3.    Decentralized reporting platform
4.    High cost of maintaining different tech stacks

2. Planning and Design

Every customer has different needs and priorities. During the Planning and Design phase, we propose our solution, outlining the timeline and overall architecture. During this stage we also involve architects, data admins, infrastructure admins, legal and security teams, and the Power BI product team if required.

We divide this phase into five steps.

1.  Perform detailed gap analysis to identify the different features, and visualization and modelling challenges
2.  Propose architecture that covers security implementation, refresh latency, and performance level requirements
3.  Design report templates and prepare mock-ups
4.  Define scope for automated validation
5.  Arrive at a deployment strategy

3. Execution

During the Execution phase, we implement the approved solution architecture. To ensure proper workflow, we break down our procedure into accessible segments. We start with the environment setup to develop Power BI reports. We follow the agile methodology to determine sprint execution, with the following steps:

1.   Sprint plan – We create a product backlog and define the scope and size of the sprint
2.  Implement – Using best practices, standard templates, and themes, we start the report migration and provide incremental builds of the reports
3. Performance Tuning – We rethink the architecture and report layout, optimizing the data model to fine-tune the reports for best performance
4.  Test – We use a set of in-house tools to perform automated testing that tracks query performance and insights on visual layout and data validation
5.    Deploy – We mark the closure of our sprint by automating the report deployment and readying the build for UAT
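The automated deployment in the final step can be sketched with the Power BI REST Imports API, which publishes a .pbix file into a workspace. The endpoint below is the documented `POST /groups/{groupId}/imports` call; the token, workspace id, and the `CreateOrOverwrite` conflict policy shown here are illustrative choices, not necessarily the pipeline we used.

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"

def import_url(group_id, display_name):
    """Build the Imports endpoint URL for publishing a .pbix into a workspace."""
    return (f"{API}/groups/{group_id}/imports"
            f"?datasetDisplayName={display_name}&nameConflict=CreateOrOverwrite")

def deploy_report(token, group_id, pbix_path, display_name):
    """Upload the .pbix to the target workspace and return the import id."""
    with open(pbix_path, "rb") as f:
        resp = requests.post(
            import_url(group_id, display_name),
            headers={"Authorization": f"Bearer {token}"},
            files={"file": f},
        )
    resp.raise_for_status()
    return resp.json()["id"]
```

Running such a call per report at sprint close is what lets deployment be automated rather than done by hand in the portal.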

4. Deployment and Post-Production

It is important to consider the end user when building a plan. During the Deployment and Post-Production phase, we:

1.   Conduct user acceptance testing (UAT) through numerous user acceptance sessions
2.   Get the sign off for UAT and production
3.   Automate the deployment of all reports to production for end users to consume

UAT allows technical users, business users, and operations team members to become familiar with the new data system as the old system is gradually moved offline.

To complete the transfer of ownership, we hand off the code, report, and workspace inventory to our customers.

5. Center of Excellence

We want our clients to develop into self-sufficient teams. During the Center of Excellence phase, we train customers by sharing best practices and offering virtual training on administrative steps, governance, security, and Power BI functionality. Our Center of Excellence (CoE) offers the following sessions:

  Admin CoE 

6. Decommissioning

When migrating platforms, it is important to be mindful of redundancies. In the Decommissioning phase, we systematically retire the old reports, following a carefully planned transition to minimize the impact to business.

Benefits of Migrating to Power BI

With this six-phase strategy, our customers are able to successfully migrate to a Power BI platform. This results in improved decision-making and positive business outcomes, including:

1.    Quicker insights for decision-making
  Improved and timely decision-making thanks to reduced latency between data sources and reports 
  Improved scalability 
2.    Self-Service BI
  Power users and end users can take advantage of the organization’s data assets to make data-driven decisions, and streamline operations 
  Users can derive key insights by combining data from disparate sources such as D365, Microsoft 365, and hundreds of other relational databases 
3.    Centralized Reporting
  Ease of management and governance of the organization’s reports through centralized admin capabilities 
  Single source of truth for all reporting layers through shared datasets 
4.    Security Benefits
  Role-based access through RLS 

Tuesday, May 5, 2020

Make Power BI workspace governance easy

Power BI is one of the most trusted enterprise BI reporting platforms. It offers powerful analytics and reporting features for organizations while requiring a minimal learning curve.

Intended Audience:

Enterprise customers of Power BI with both high adoption and usage rates need ways to simplify workspace governance. This blog provides an easy-to-use automated solution.

Business Scenario:

Most of our Fortune 500 customers use Power BI for analytics. The increased use of O365 groups makes workspace management difficult: since every Teams group creates its own Power BI workspace, managing these manually becomes tedious.

Workspaces establish collaborative environments where colleagues can create collections of dashboards and paginated reports. These collections can be bundled together into an app and distributed across the whole organization, or to specific people and groups.

To identify empty workspaces, report administrators had to navigate to the Power BI portal and manually check if a workspace was empty.

Technical Implementation:

We developed a web app that uses Power BI REST API calls to fetch workspace metadata, including information about the reports and dashboards present in them. Once the prerequisites are fulfilled and access is granted, the app presents a list of workspaces accessible to users in the tenant, based on user role. It uses the Power BI REST APIs to determine whether workspaces have content (reports, datasets, dashboards) in them, displaying the data on a web page. A delete button attached to each workspace lets users delete that workspace from this centralized page.

Figure 1: Centralized page displaying workspace content information
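A minimal sketch of the workspace scan, assuming a valid bearer token has already been acquired: it lists workspaces via the documented `GET /groups` endpoint, checks each for reports, dashboards, and datasets, and removes empty ones via `DELETE /groups/{id}`. This illustrates the API calls behind the web app, not its actual codebase.

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"

def is_empty(contents):
    """A workspace is empty when it holds no reports, dashboards, or datasets."""
    return not (contents["reports"] or contents["dashboards"] or contents["datasets"])

def find_empty_workspaces(token):
    """Return the id and name of every empty workspace visible to the caller."""
    headers = {"Authorization": f"Bearer {token}"}
    empty = []
    for g in requests.get(f"{API}/groups", headers=headers).json()["value"]:
        contents = {
            kind: requests.get(f"{API}/groups/{g['id']}/{kind}",
                               headers=headers).json()["value"]
            for kind in ("reports", "dashboards", "datasets")
        }
        if is_empty(contents):
            empty.append((g["id"], g["name"]))
    return empty

def delete_workspace(token, group_id):
    """Irreversible: remove the workspace via DELETE /groups/{id}."""
    requests.delete(f"{API}/groups/{group_id}",
                    headers={"Authorization": f"Bearer {token}"}).raise_for_status()
```

The web page simply renders the `find_empty_workspaces` results and wires each delete button to `delete_workspace`.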

The Power BI REST APIs are internally configured to handle the conversion of groups from classic to V2 workspaces during deletion. As this change is irreversible, we enabled a warning message that prompts users before deleting empty workspaces.

Access Instructions:

User Access

To access the application, users must sign in using their O365 accounts. To view the workspace details, the user only needs viewer access. However, in order to delete workspaces, the user also needs admin access.

App Permissions

To run the application, the user must grant consent for the Workspace.ReadWrite.All and Report.ReadWrite.All permissions. These are required by the AAD app to generate the access token used by the remaining API calls to retrieve data. This enables the application to obtain the list of accessible workspaces. To authorize the application to call the APIs that delete workspaces, users/admins need to provide Admin Consent. The list of configured permissions should include all permissions needed by the application, as shown below.

Figure 2: List of required permissions
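A sketch of how such an AAD app might acquire a delegated token with Microsoft's MSAL library, assuming an interactive sign-in with the user's O365 account; the client and tenant IDs are placeholders, and the actual web app may use a different MSAL flow.

```python
POWER_BI_RESOURCE = "https://analysis.windows.net/powerbi/api"

def delegated_scopes(*permissions):
    """Fully qualified scope strings for delegated Power BI permissions."""
    return [f"{POWER_BI_RESOURCE}/{p}" for p in permissions]

def acquire_token(client_id, tenant_id):
    """Interactively sign the user in with their O365 account; return a bearer token."""
    import msal  # pip install msal
    app = msal.PublicClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
    )
    result = app.acquire_token_interactive(
        scopes=delegated_scopes("Workspace.ReadWrite.All", "Report.ReadWrite.All"))
    return result["access_token"]
```

The returned token is what the subsequent Power BI REST API calls carry in their `Authorization` header.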

Business Outcomes:

An automated approach:

  Reduces the effort in identifying and deleting empty workspaces 
  Enables role-based admin privileges to prevent misuse and unauthorized usage
  Enables workspace tracking in the organization 

Using the web-based tool, administrators can now clear out unused workspaces at the click of a button. By providing tenant-based role permissions, we ensure security and prevent unauthorized usage. Thus, implementation of the solution has helped reduce needless efforts, increasing overall productivity within the organization.

How to get the Solution:

Click here to access the codebase.


Feel free to fork the solution and contribute to it. If you come across any issues, please raise them in the issues section.

Monday, May 4, 2020

Page Load Time: Tracking the Performance of Power BI Reports

Power BI has become one of the preferred data visualization options across organizations. Many companies rely on the powerful analytics and reporting features offered by Power BI.

Business Scenario:

Most of our Fortune 500 customers use Power BI as their default reporting and analytics solution. It is important to ensure a great experience for users, who rely on these reports to drive their business decisions. One of the KPIs we track to ensure great user experience is the report load time.

Unfortunately, this process is complicated by a variety of factors including data volume, network bandwidth, and other system parameters. We needed an automated solution that could capture the page load time of Power BI reports with high accuracy, while also tracking performance improvements made by the development teams.

Existing Process:

Page Load Time (PLT) is commonly measured either manually through web page refresh or via the Performance Analyzer tool. Manual calculation is not only inaccurate, but also highly inefficient. The Performance Analyzer tool, on the other hand, is available only in Power BI Desktop, which limits its usefulness for gauging report performance on the Power BI service.

In addition, most of the custom solutions available today focus on measuring the whole page’s load time. This includes the loading of non-essential components of the Power BI service that are not critical to the actual load time of the report.

At present, there is no easy way to gauge the impact of incremental optimizations made to reports, nor a way to track improvements over time.

Key Business Challenges:

  Manual effort – Reduce the effort required to manually capture the page load time of reports. 
  Accuracy – Examine only the important components of Power BI when calculating the load time. 
  Report tracking – Track the impact of modifications made to Power BI reports over time. 
  Benchmarking – Benchmark acceptable load time and track against that number. 
  Efficient tracking – Track page load time at a page level. 

What is the PLT in our context?

Our definition of PLT encompasses the end user experience when they load a Power BI report from their browser. This is marked by the total time taken between the first request sent by a browser and the last object rendered on the page.
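This definition maps onto the browser's Navigation Timing API: PLT is `loadEventEnd - navigationStart`. The sketch below uses Selenium (an assumption for illustration; our actual solution is a browser extension) to show the calculation. Note that on a single-page app like the Power BI service, `loadEventEnd` can fire before every visual finishes rendering, so a real implementation also has to track rendered content.

```python
def page_load_time_ms(timing):
    """PLT: first request (navigationStart) to last object rendered (loadEventEnd)."""
    return timing["loadEventEnd"] - timing["navigationStart"]

NAV_TIMING_JS = "return performance.timing.toJSON();"

def measure(driver, report_url):
    """`driver` is a Selenium WebDriver already signed in to the Power BI service."""
    driver.get(report_url)
    return page_load_time_ms(driver.execute_script(NAV_TIMING_JS))
```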


We designed a browser extension that takes the report URL(s) and Google Chrome profile from users to deliver PLT results via a Power BI report.

Solution Design:

For developers, the browser extension can suffice to check current report load time. It has an easy-to-use input medium, which users can update with the necessary details and execute in the browser.

In order to track and perform trend analysis, we extended the solution to include an Azure database that stores PLT information. This includes elements like PLT capture, timestamps, and page URL(s). Users can configure information about what report URLs to track using an XML file. 

For those intending to institutionalize the load metric, we extended the solution to use Power BI REST API to track any new pages or reports added to the workspace on the Power BI service. This both tracks reports published on the workspace and measures load time. This data can be used as one of the metrics to accomplish before moving from UAT/Test to the production workspace.
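The REST-based tracking above might enumerate page URLs as sketched below, using the documented `GET /groups/{id}/reports` and `GET .../reports/{id}/pages` endpoints; the app.powerbi.com URL pattern is an assumption based on how the service links to report pages.

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"

def page_url(group_id, report_id, page_name):
    """Direct URL of one report page in the Power BI service (assumed pattern)."""
    return f"https://app.powerbi.com/groups/{group_id}/reports/{report_id}/{page_name}"

def tracked_urls(token, group_id):
    """Enumerate every page of every report in a workspace for PLT tracking."""
    headers = {"Authorization": f"Bearer {token}"}
    urls = []
    reports = requests.get(f"{API}/groups/{group_id}/reports",
                           headers=headers).json()["value"]
    for report in reports:
        pages = requests.get(
            f"{API}/groups/{group_id}/reports/{report['id']}/pages",
            headers=headers).json()["value"]
        urls += [page_url(group_id, report["id"], p["name"]) for p in pages]
    return urls
```

Re-running `tracked_urls` on a schedule is how newly published reports and pages get picked up for measurement automatically.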

Data Representation:

Once data is captured and added to the back end, it can be represented in a Power BI report. This method can highlight reports and/or pages with the highest PLT, and those with the lowest. We have listed some data representation options below that might appeal to organizations that want to track PLT centrally. 

You can show variance around a benchmark by setting benchmarks at the report level. Representations of trend information highlight the top gainers and losers in terms of load time.
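The benchmark-variance and gainers/losers calculations can be sketched as simple functions; the sign convention here (positive variance means slower than benchmark, "gainers" means biggest PLT drops) is our assumption for illustration.

```python
def variance_from_benchmark(plt_ms, benchmark_ms):
    """Signed percentage deviation from the report-level benchmark (positive = slower)."""
    return (plt_ms - benchmark_ms) / benchmark_ms * 100

def top_movers(previous, current, n=3):
    """Rank pages by PLT change between two snapshots (page -> PLT in ms).

    Returns (top gainers = biggest load-time drops, top losers = biggest rises).
    """
    deltas = {page: current[page] - previous[page]
              for page in previous if page in current}
    ranked = sorted(deltas.items(), key=lambda kv: kv[1])
    return ranked[:n], ranked[-n:]
```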

You can also set up emails of report thumbnails that send to specific recipients. These contain the metric information, URL, and indicators about whether a report loaded correctly.

Figure 1: PLT Compliance Report

Our Power BI compliance report allows leads to track the performance of their reports over the course of two weeks.

Figure 2: Sample PLT tool report

Figure 3: Sample PLT Trends

Business Outcomes:

  Better visualization of report performance. 
  Better understanding of how modifications to reports impact performance over time. 

Through this solution, users can quantifiably track report performance. It provides data validation and tracking services. In addition to improving the performance of reports, our solution also improves overall user experience.