Monday, October 11, 2021

What is ‘Tech Intensity’ and Why Won’t the Tech Industry Stop Talking About it?

If you work in tech, you’ve probably come across the term ‘tech intensity.’ It’s been the hot-button topic of the last couple of years, this decade’s version of “digital transformation.” But what is it? Buzzword or business term? Hype or strategy?

Starting at the beginning: where did ‘tech intensity’ come from?

The phrase “tech intensity” was first introduced by Microsoft CEO Satya Nadella in his keynote speech for Microsoft Ignite 2018. He based the concept on research by Dartmouth economist Diego Comin, who analyzed the role of rapid industrialization in determining a country’s economic success. According to Comin, speed is not the only important success factor. What’s key is the intensity of technological adoption: the ability to train people to use it.

Digital transformation has been the hallmark of a modernized company for decades. In his speech, Nadella suggested that digital is now the norm. To consider themselves innovators, companies need to go above and beyond the norm. They need to invest in tech intensity.

So, what is tech intensity?

In the simplest terms, tech intensity is a company’s ability to adopt and fully integrate emerging technologies. Through tech intensity, companies build a strong digital foundation. In a blog post, Nadella explains, “I think of tech intensity as being an equation: (tech adoption) ^ tech capabilities."

Let’s break that down further. Tech intensity is your ability to adjust your tech capability to your adoption rate of new technology. That is: it’s not just about moving to the cloud or migrating to Power BI. Tech intensity is about using the cloud and Power BI to build custom solutions for your company or your industry.

There are three main pillars to tech intensity:
1.    Integrating technology as quickly as possible
2.    Investing in digital intellectual property
3.    Achieving trust in technology

When it comes to integrating modern tools, you don’t need to reinvent the wheel. Using open-source and low code/no code (LC/NC) solutions, it’s easier than ever to improve the way you run your business. Of course, for a globally distributed company implementing a new digital initiative, keeping everyone on track can be challenging. Here, the cloud is your greatest ally. By using cloud services such as Microsoft Azure, you centralize your insights, updates, and data. With a single source of truth, you leverage one of the most important elements of tech intensity: speed.

Tech intensity goes beyond using new technologies. You need to adapt and create solutions where you identify gaps. At MAQ Software, our expertise lies in the Microsoft stack. We build solutions in Power BI, Azure Synapse, Power Apps, Dynamics 365, and more. However, we also created several custom tools we use to strengthen our solutions. For example, our Azure migrations are faster than standard cloud migrations because our cloud assessment tool shows a complete overview of all cloud assets within an environment in less than half an hour. 

Ultimately, it’s not enough to just be fast in adopting tech. With today’s security, a company’s relationship to technology must be built on trust. This means regularly updating your privacy, security, and compliance protocols, investing in cybersecurity, and creating solutions that are accessible across your organization. Small changes can make a big difference. For example, our implementations include data dictionaries to ensure business audiences understand the meaning of metrics.

Why is tech intensity important? The world is changing fast.

“Technology moves faster than our imaginations can keep up with. We invent one breakthrough technology today and then tomorrow’s inventors transform it into another we never imagined possible.” - The Atlantic 

Go back twenty-ish years and we don’t have Facebook, OneDrive, YouTube, iPhones, or USB flash drives. The world has changed astronomically in the last two decades. According to projections, this isn’t a trend that will slow down anytime soon. This year, the United States federal government spent over $5 billion on its civilian IT budget, an investment that leads to better technology and, in turn, faster change. Tech intensity is how companies can not only keep up but stay ahead of the ever-accelerating curve.

Why is tech intensity important? Data is more important now than ever.

According to several projections, the world will have 50 billion (yes, billion) connected devices by 2030 and 175 ZB of data by 2025. That is a volume of data that is almost impossible to comprehend. Picture every song, movie, book, poem, short story, painting, photograph, and YouTube video ever created in the history of humanity. That adds up (roughly) to 100 exabytes. If you saved 100 back-up copies of everything humanity has ever created, you’d still have only about 10 ZB.
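A quick back-of-the-envelope check of those magnitudes (taking the rough 100 EB estimate above at face value):

```python
# Rough scale check: 1 ZB = 1,000 EB = 10**21 bytes.
EB = 10**18  # exabyte, in bytes
ZB = 10**21  # zettabyte, in bytes

human_creative_output = 100 * EB  # rough estimate of everything ever created
projected_2025_data = 175 * ZB    # projected global data volume by 2025

# How many full copies of humanity's entire creative output fit in 175 ZB?
copies = projected_2025_data // human_creative_output
print(copies)  # 1750
```

In other words, the projected 2025 data volume could hold well over a thousand complete copies of everything humanity has ever made.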

Tech intensity enables companies to use and analyze the previously unthinkable volumes of data they work with every day. Usually, size and speed are opposites. Big data means slower processing, collection, and visualization times because, well, there’s just more to work with, right? Not anymore. This is where tech intensity really shines. Of those 175 ZB, Forbes estimates that almost half will reside in cloud environments. The technology to respond to big data is there; it’s just a matter of integrating it. According to digital analyst and business strategist Brian Solis, “We are looking at a future in which companies will indulge in digital Darwinism, using IoT, AI and machine learning to rapidly evolve in a way we’ve never seen before."

Why is tech intensity important? A changing world needs a changing mindset.

Tech intensity is not just about the tech. It’s about a mindset shift towards adoption and innovation. “Simply put, technology intensity is a critical part of business strategy today,” Nadella explains. “In my experience, high-performance companies invest the most in digital capabilities and skillsets.”

Tech intensity means democratizing tools, technologies, and business data across your organization. There is a delicate balance between ensuring key data is available and ensuring only authorized people have access to it. When you achieve that balance, you enable team members to build their own solutions to problems they’ve identified, and make critical business decisions using real-time data. Enabling and supporting democratization efforts ensures you can adopt tech quickly and build your custom solution portfolio. Tech intensity begets tech intensity. 

On the process side of things, companies that want to invest in tech intensity also need to invest in Agile. A modern mindset needs a modern process. One of the challenges that comes with the speed of tech intensity is a disconnect between company initiatives and individual skillsets. However, if you’ve done the work to refine your process, you can ensure that no one gets left behind. 

For example, when we help large-enterprise companies migrate to Power BI, it’s never just about the technical process of moving from one system to another. Yes, there are many technical parts of migration, from analyzing the existing data architecture to optimizing DAX. But one of the most important parts of our custom migration process is our Center of Excellence trainings. During our CoEs, we train team members in Power BI capabilities and best practices, ensuring they are familiar and comfortable with the new platform. These trainings speed up adoption time frames from a matter of years to a matter of months. After all, tech intensity is not a mindset reserved exclusively for leadership; it needs buy-in from every person at the company.

Why is tech intensity important? Covid-19.

If there’s anything the business world has learned from the last year and a half, it’s that systemic change that might have once seemed impossible can occur in the blink of an eye. For some, the transition to work-from-home took place over a weekend. The key differentiator between the businesses that struggled and businesses that flourished? Tech intensity. 

“While the pandemic has taught us that no business is 100 percent resilient, those fortified by digital technology are more resilient, more capable of transforming when faced with these secular structural changes in the marketplace. We call this tech intensity, and every organization and every industry will increasingly need to embrace it in order to be successful and grow.” - Satya Nadella, Microsoft Inspire 2020 keynote


Want to invest in tech intensity for your organization and not sure where to get started? Check out our consulting offers in a wide range of technologies, from Power Platform to Dynamics 365, or reach out to us.

Up Next

Monday, December 21, 2020

Azure Services Product Comparison

With over 50 Azure services out there, deciding which service is right for your project can be challenging. Weighing the pros and cons of each option for numerous business requirements is a recipe for decision paralysis. When it comes to optimizing your Azure architecture, picking the right tool for the job is key.

Over the last decade, MAQ Software has migrated hundreds of clients to Azure. Along the way, we’ve picked up Azure tips and tricks that enable us to develop systems that are faster, more reliable, and less costly to operate. In this article, we’ll explore the differences between Azure services so you can pick the one that’s right for you.

Table of Contents

Which Azure cloud storage service should I use?

Azure Data Lake Storage (ADLS) Gen 1 vs. ADLS Gen 2 vs. Blob Storage

When to use Blob storage instead of ADLS Gen 2:
When your project involves minimal transactions, use Blob storage to minimize infrastructure costs. For example: production backups.

When to use ADLS Gen 2 instead of Blob storage:
When your project has multiple developers or involves analytics, and you need to securely share data with external users, use ADLS to leverage Azure Active Directory authentication. This prevents unauthorized users from accessing sensitive data. For example: global sales dashboards.

When to use ADLS Gen 1 instead of ADLS Gen 2:
When your project executes U-SQL within Azure Data Lake Analytics (ADLA) on top of storage services, use ADLS Gen 1. This is useful for projects that need analytics and storage performed from a single platform. For example: low-budget implementations.
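The three rules above can be condensed into a quick, illustrative decision helper (the predicate names are ours, not an official Azure decision tree):

```python
# Hypothetical helper encoding the storage guidance above.
def choose_storage(uses_usql, needs_secure_sharing, minimal_transactions):
    if uses_usql:
        return "ADLS Gen 1"    # U-SQL via ADLA runs against Gen 1
    if needs_secure_sharing:
        return "ADLS Gen 2"    # Azure AD authentication for external users
    if minimal_transactions:
        return "Blob storage"  # lowest infrastructure cost, e.g. backups
    return "ADLS Gen 2"        # sensible default for analytics workloads

print(choose_storage(False, False, True))  # Blob storage
```

Real projects weigh more factors than this, of course, but the order of the checks mirrors the guidance: workload requirements first, security needs second, cost last.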

When to use geo-replication: Geo-replication is available on all Azure cloud storage services. As a rule of thumb, avoid geo-replicating the development environment to keep infrastructure costs down. Only implement geo-replication for production.

Which Blob storage access tier to use: Picking the optimal access tier is important to achieve your desired performance at minimal storage costs:

Hot
    Key differentiator: Optimized for frequent access to objects in the storage account
    When to use it: For projects that require daily refresh and frequent data transactions
    Example: cloud data with daily refresh

Cool
    Key differentiator: Optimized for storing large volumes of data that is infrequently accessed and stored for at least 30 days
    When to use it: For projects that require monthly refresh with limited transactions
    Example: monthly snapshots

Archive
    Key differentiator: Optimized for storing large volumes of data that is infrequently accessed and stored for at least 180 days
    When to use it: For projects that require static data, snapshots, and/or yearly refresh storage with almost no transactions
    Example: yearly snapshots

Table 1: Blob Storage Access Tier Comparison
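Table 1’s minimum-retention figures suggest a simple rule of thumb; a sketch, with the day cutoffs taken from the tier descriptions above:

```python
def choose_access_tier(days_between_access):
    """Map how often data is accessed to a Blob storage access tier."""
    if days_between_access < 30:
        return "Hot"      # frequent access, e.g. daily refresh
    if days_between_access < 180:
        return "Cool"     # infrequent access, stored for at least 30 days
    return "Archive"      # rarely accessed, stored for at least 180 days

print(choose_access_tier(1), choose_access_tier(30), choose_access_tier(365))
# Hot Cool Archive
```

Note that Cool and Archive carry early-deletion charges if data is removed before the minimum retention period, which is why matching the tier to your actual refresh cadence matters.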

Which Azure cloud processing service should I use?

Azure Databricks vs. Azure Synapse Analytics

When to use Azure Databricks:
When your project involves machine learning or deep learning models, or needs to process streaming data in real time, use Azure Databricks:

    Deep learning models: Azure Databricks reduces ML execution time by optimizing code and using some of the most popular libraries (e.g., TensorFlow, PyTorch, Keras) and GPU-enabled clusters.

    Real-time transformations: Databricks Runtime supports Spark's structured streaming and autoloader functionality, meaning it can process streaming data such as Twitter feeds.

When to use Azure Synapse Analytics:
When your project centers on SQL analysis and data warehousing, or on reporting and self-service BI, and you need to process large volumes of data without an ML model, use Azure Synapse Analytics:

    SQL analyses and data warehousing: Synapse is a developer-friendly environment, supporting full relational data models and stored procedures, and providing full standard T-SQL features. When migrating data from on-premises to the cloud, use Synapse for a seamless transition.

    Reporting and self-service BI: Power BI is integrated directly into the Synapse studio, which reduces data latency.

Which Azure Databricks pricing tier to pick: Choosing the right pricing tier is important to reduce infrastructure costs while achieving the desired performance:

Premium
    Key differentiator: Supports role-based access to notebooks, clusters, and jobs
    When to use it: For projects that involve multiple stakeholders in a shared development environment
    Example: one platform accessed by multiple vendors

Standard
    Key differentiator: Costs ~20% less than the premium tier
    When to use it: For projects that involve a single stakeholder and limited development teams
    Example: one platform run and developed by a single vendor

Table 2: Databricks Pricing Tier Comparison

Which Azure Databricks workload to use: Choosing the right workload type is important to ensure you achieve the desired performance at a reduced cost:

Data Analytics
    Key differentiator: More flexible, supporting interactive clusters and an analytics ecosystem
    When to use it: Suited for dev environments, which require higher collaboration and multi-user sharing

Data Engineering
    Key differentiator: Costs ~30% less than Data Analytics
    When to use it: Suited for UAT/prod environments, which generally require less collaboration and can be run by a single developer

Table 3: Databricks Workload Comparison

How to choose Databricks cluster size: For development purposes, start with a smaller Databricks cluster (i.e., general purpose) and enable autoscaling to optimize costs; you can scale up to larger clusters as project needs grow.
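For reference, autoscaling is declared in the cluster definition itself. A sketch of what an autoscaling development cluster might look like when created through the Databricks Clusters API (the name, runtime version, and sizes here are illustrative):

```json
{
  "cluster_name": "dev-general-purpose",
  "spark_version": "9.1.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "autoscale": {
    "min_workers": 2,
    "max_workers": 8
  }
}
```

With autoscale set, Databricks adds and removes workers between the two bounds based on load, so a quiet dev cluster stays small and cheap.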

Which Azure data serving service should I use?

Azure SQL Database vs. Azure Synapse (formerly Azure SQL Data Warehouse)

When to use Azure Synapse:
When projects deal with large volumes of data (>1 terabyte) and a small number of users, or use OLAP data. For example: global sales data.

When to use Azure SQL database:
When projects work with real-time data (max 1 terabyte) and many users, or use OLTP data (application database). For example: ATM transactions.

Still have questions? Feel free to reach out to us.

Do you want to know more about our data management best practices, projects, and initiatives? Visit our Data Management Hub to learn more.

Tuesday, December 15, 2020

Resolve Support Tickets Faster with Predictive Risk Algorithms

Business Case:

Our client, a global technology company, receives approximately 15 million customer support tickets a year. With such an overwhelming volume, it was impossible for support engineers to identify which tickets were at highest risk of breaching their service-level agreements (SLAs). As a result, high-risk tickets could go unresolved for weeks, creating a major backlog.

At the time of partnering with us, our client had approximately 10 million open support cases in their ecosystem. Our client needed a solution that enabled support engineers to resolve tickets faster.

Key Challenges:

  Automatically assign ticket priority  
  Accurately predict high-risk cases  

Our Solution:

We implemented a CatBoost algorithm that uses machine learning to prioritize tickets based on risk probability.

Using historical ticket data, we identified the categorical and numerical attributes that our algorithm would evaluate to determine risk probability. The identified attributes included ticket type, ticket severity, initial response time, total time in queue, SLA state, and customer satisfaction score.
After comparing numerous algorithms, we concluded that CatBoost (an algorithm for gradient boosting on decision trees) provided the most accurate results. Compared to other models, CatBoost:

  Allows the most categorical attributes  
  Saves time with faster model loading (does not require time-consuming one-hot encoding and pre-processing)  
  Is available at a lower cost  
  Requires a minimal learning curve  

To launch the model, our team used data outliers to inform the algorithm. With CatBoost’s machine learning capability, the algorithm increases its accuracy over time.
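To illustrate the prioritization step (a sketch, not our production code): once the trained model emits a risk probability per ticket, ranking and flagging reduces to a sort. The ticket IDs and threshold below are hypothetical.

```python
def prioritize(tickets, risk_scores, high_risk_threshold=0.8):
    """Return (ticket, score, is_high_risk) tuples, highest risk first."""
    ranked = sorted(zip(tickets, risk_scores), key=lambda pair: pair[1], reverse=True)
    return [(t, s, s >= high_risk_threshold) for t, s in ranked]

queue = prioritize(["T-101", "T-102", "T-103"], [0.35, 0.91, 0.62])
print(queue[0])  # ('T-102', 0.91, True)
```

In the deployed solution, the scores come from CatBoost’s predicted probabilities over attributes such as ticket severity, time in queue, and SLA state; sorting and thresholding is what pushes the riskiest tickets to the top of an engineer’s queue.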
Figure 2: Mobile Notification

To improve support engineers’ response time, we implemented an integrated web feature that sends high-risk ticket notifications to engineers’ cell phones.

Business Outcomes:

Our CatBoost algorithm implementation enabled support engineers to resolve high-priority tickets faster. The algorithm uses machine learning to accurately assign ticket priority based on comprehensive risk evaluation. Prior to our solution implementation, our client had a backlog of 10 million tickets. Now, support engineers can reduce their backlog while keeping up with incoming tickets. With our added cell phone notifications, support engineers can immediately respond to high-risk tickets and ensure SLA compliance. Overall, our solution has enabled our client to significantly improve customer satisfaction.


    Implemented a CatBoost algorithm that automatically prioritizes tickets based on risk probability
    Reduced ticket resolution time
    Added a cell phone notification for high-risk tickets

For more on customer support, check out how our custom Dynamics 365 portal improved our client's gaming support operations

Thursday, December 3, 2020

Create a Single Source of Truth with Dynamics 365

Business Case:

Our client leases data centers across the globe. To maintain their leases, our client coordinates with up to 100 lease providers, which involves multiple methods of communication. In addition, a single lease can require as many as 20-30 documents.

Previously, our client tracked center details across multiple spreadsheets. Quotations, site progress, and updates were managed over email and through notations from in-person visits. Our client struggled to consolidate data and track updates. They needed a centralized, easy-to-maintain solution that provided one source of truth for global information.

Key Challenges:

  Consolidate assets into a single location, including emails, documents, and meeting notes 
  Integrate existing SharePoint system with the solution  
  Enable users from different regions to easily access global data center details such as sites, lease providers, and geographies  

Our Solution:

We created a Dynamics 365 portal that consolidates and presents details from email, Excel spreadsheets, OneNote, and SharePoint Online.  
Figure 1: Solution Design

Our solution uses Dynamics 365 App for Outlook to link to our client’s primary mail system. Users can track Outlook emails and appointments directly from the Dynamics 365 environment. The portal consolidates all global data center information, which can be accessed from different geographies.  

During our Outlook implementation, we identified an opportunity to enable our client to manage data directly in Dynamics 365. Understanding the value in this capability, our client expanded the solution scope. We added entities (tables in the Dynamics 365 environment) to replace our client’s existing Excel records. Dynamics 365’s built-in auditing capabilities allow our client to track edits in real time. Users can also import Excel spreadsheets directly into the portal rather than manually inputting information. We added an extension that enables users to upload OneNote and SharePoint Online files to the Dynamics 365 portal.

We implemented role-based security to protect our client’s sensitive data. There are three main roles: 
1.  Admin: Can access, update, create, and delete all records. 
2.  User: Can access, update, create, and delete any records within their specific territory/project. 
3.  Reader: Can view all records, but does not have editing permissions. 
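The role model can be sketched as a small permission table (illustrative names only; the real solution uses Dynamics 365 security roles, not application code):

```python
# Hypothetical encoding of the three roles described above.
ROLE_EDIT_SCOPE = {
    "admin": "all",        # access, update, create, and delete all records
    "user": "territory",   # same rights, limited to their territory/project
    "reader": None,        # view all records, no editing permissions
}

def can_edit(role, record_territory, user_territory):
    scope = ROLE_EDIT_SCOPE[role]
    if scope == "all":
        return True
    if scope == "territory":
        return record_territory == user_territory
    return False

print(can_edit("user", "Paris", "Paris"))   # True
print(can_edit("reader", "Paris", "Paris")) # False
```

Scoping the User role by territory keeps regional teams self-sufficient without exposing every record to every editor.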

Our solution uses the Dynamics 365 export service to auto-synchronize our client’s SQL database. The SQL database acts as an import model for Power BI reports that generate key leadership insights. To ensure data is validated securely from end to end, we implemented Power Automate flows. If the uploaded data contains errors, our solution sends an email notification to managers, who can take immediate corrective action.  

Business Outcomes:

Our centralized Dynamics 365 platform enabled our client to track appointments, emails, documents, and notes across the entire team. By developing a single source of truth, our client avoids miscommunication and rework, reducing data management efforts by 60%. Maintaining data sets is easier, as all information can be accessed from one location. With role-based security and edit tracking, our client can monitor access and update permissions.  

We integrated data from multiple sources, which enables field workers to operate seamlessly. For example, a field worker might deal with a data center that operates out of Paris. All team members can view emails and appointments from our centralized platform. If the field worker then visits the Paris data center and records meeting notes on OneNote, these can also be uploaded directly to the Dynamics 365 portal. All relevant data will be automatically associated with the Paris data center records.

Our solution reduces data management and tracking efforts by 4-6 hours a week. Since implementation, our portal has supported multi-channel communication between our client’s 50 team members and the nearly 100 data center providers around the world.


    Developed a Dynamics 365 portal that centralizes data across emails, Excel spreadsheets, OneNote, and SharePoint Online
    Eliminated manual tracking, reducing data management efforts by 60%
    Enabled seamless communication between 50 team members across the globe, reducing efforts by 4-6 hours a week
    Simplified data maintenance by enabling users to edit data directly in the portal
    Implemented role-based security

Thursday, November 19, 2020

Engineer Product Features in Real Time Based on Customer Feedback

Business Case:

In today’s consumer-centric environment, the best way to improve a product is by responding to customer feedback. A typical customer feedback cycle requires cross-team communication between support, sales, and engineering. The sales team offers insights into product features that receive recurring feedback; the engineering team has the technical expertise to implement changes and offer support. Responding quickly to feedback is critical to ensure customer satisfaction.

Our client, a global technology giant, stores support ticket information in Azure DevOps. Our client’s sales team receives thousands of support cases a day, but was converting only 600 cases a year into engineering features. Without an effective classification and communication system, feedback went unresolved. Although the sales team was logging customer feedback, they couldn’t connect with the engineering team, who often didn’t know the feedback existed. In partnering with us, our client needed to bridge the gap between their sales team and engineering team.

Key Challenges:

  Define a feedback triage process for the engineering team 
  Effectively share feedback between teams  
  Create a reporting system where engineers can access all relevant feedback data  
  Reduce feedback response time 

Our Solution:

We developed a two-part solution to connect our client’s sales and engineering teams: 
1.  Establish a feedback triage process 
2.  Create a Power BI dashboard that pulls feedback information from the engineering DevOps 

Figure 1: Solution Design

The triage process begins by pulling customer feedback from two different sales portals into a single DevOps workspace. During level one triage, the engineering team verifies the feedback Work Item is complete with pre-defined metrics such as impact, customer scenario, and description. If the Item lacks key information, the team sends it back to the requestor.  

As part of level two triage, the engineering team meets weekly to assess feedback Items. If an Item needs additional information, the team sends it back to the requestor. Otherwise, the team tags the feedback as “Evidence” and attaches it to an engineering Feature. Engineers respond to the Features directly through DevOps. One Feature might contain dozens of Evidence Items based on repeat customer feedback. To ensure Features aren’t duplicated, we created a separate table in an Azure Databricks notebook that maps new feedback with existing Features.  
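The level-one check is essentially a completeness gate; a minimal sketch, with illustrative field names:

```python
# A Work Item must carry the pre-defined metrics to pass level-one triage.
REQUIRED_FIELDS = ("impact", "customer_scenario", "description")

def triage_level_one(work_item):
    """Return (accepted, missing_fields); incomplete Items go back to the requestor."""
    missing = [f for f in REQUIRED_FIELDS if not work_item.get(f)]
    return (not missing, missing)

item = {"impact": "high", "customer_scenario": "checkout fails", "description": "..."}
print(triage_level_one(item))                # (True, [])
print(triage_level_one({"impact": "high"}))  # (False, ['customer_scenario', 'description'])
```

Gating on completeness up front means the weekly level-two review spends its time assessing feedback rather than chasing missing context.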

Figure 2: Engineering Feedback Dashboard Home Page (using sample data)

With a triage process in place, we then developed a dashboard where engineers could view and prioritize all open Features. The dashboard pulls data from DevOps using API calls in Azure Databricks. Our solution uses custom calculations to derive key metrics from the data. Once transformed, data is pushed to Azure SQL tables that populate the Power BI dashboard. The dashboard includes information such as revenue impact and customer profile so engineers can effectively prioritize feedback response. Engineers can group data by sector, product type, country of origin, and many other metrics using the dashboard’s multiple tabs.  

Figure 3: Feedback by Revenue Impact Drilldown (using sample data)

Business Outcomes:

Our engineering triage process and Power BI feedback dashboard enable our client to close the gap between sales and engineering. With an effective triage process, the engineering team can quickly access relevant data. To increase response speed, we group repeat queries under a single Feature. With numerous impact views, the dashboard enables engineers to effectively prioritize tasks. Customers receive faster responses to high-priority feedback.  

Increased accountability improves the team’s ability to collaborate on technically complex feedback. Our client can continually enhance products and services based on customer requirements. Effective feedback response enables our client to build long-lasting customer relationships and identify potential revenue opportunities.  

Since implementation, our client has created over 1K feedback Items, increasing DevOps usage by 66% in under five months. Over 300 of these items have been converted into Features for the engineering team to work on. The dashboard is used across the engineering team, with over 1K views each month.


    Developed an engineering triage process and Power BI dashboard that enable the engineering team to respond to customer feedback
    Increased use of the engineering DevOps workspace by 66%
    Enabled engineers to prioritize customer feedback with a dashboard that has over 1K monthly views
    Reduced feedback response time

Monday, November 16, 2020

Manage and Share Events with a Custom Calendar App

Business Case:

Our client, a global technology company, organizes thousands of marketing events each year. To launch an event, marketing managers need to add the event to the calendar, record details, and present it to the planning team using PowerPoint. Due to our client’s massive volume of data and various capability requirements, managers needed to use separate platforms to complete their tasks. With data spread across multiple sources, managers were unable to fully view and filter events, hindering effective execution.

Unfortunately, no single platform currently on the market provides the capabilities and data capacity our client requires. In partnering with us, our client’s goal was to design a centralized calendar solution that enables managers to access and export event information from a single location.

Key Challenges:

  Design a centralized calendar capable of storing and filtering massive volumes of data 
  Enable event details to be exported to PowerPoint  

Our Solution:

We built a Power Apps-based calendar that enables our client to view, edit, add, and export up to 5,000 events. 

Figure 1: Solution Design

Out-of-the-box Power Apps storage is limited to 2,000 records. We wrote custom code that iteratively fetches records from SharePoint, enabling the calendar app to store and display up to 5,000 events. Increasing storage capacity, however, introduced a challenge: more records increased page load time. To improve page load time, we implemented collections in Power Apps. On its initial run, the app saves data as a collection, reducing the run time of subsequent data calls. 
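The pattern behind that custom code, sketched in Python for illustration (the app itself implements it in Power Fx against SharePoint; the page size and ceiling mirror the limits discussed above):

```python
PAGE_SIZE = 500
MAX_RECORDS = 5000

def fetch_all(fetch_page):
    """fetch_page(offset, limit) -> list; accumulate pages up to MAX_RECORDS."""
    records = []
    offset = 0
    while offset < MAX_RECORDS:
        page = fetch_page(offset, PAGE_SIZE)
        records.extend(page)
        if len(page) < PAGE_SIZE:  # source exhausted
            break
        offset += PAGE_SIZE
    return records

# Demo against an in-memory stand-in for the SharePoint list:
data = list(range(2300))
result = fetch_all(lambda offset, limit: data[offset:offset + limit])
print(len(result))  # 2300
```

Each iteration fetches one page within the platform’s per-query limit; looping until a short page arrives (or the ceiling is hit) is what lifts the effective capacity to 5,000 records.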

Users can log in to the app through Azure Active Directory. For enhanced security, we implemented role-based access by creating security groups in Office 365. To view calendar events, users need basic access. Calendar view is customized based on the user’s security group role. To add and edit calendar events, users need administrator access. All events and details are stored in a SharePoint list. Both basic users and administrators can select and export events and details directly to PowerPoint. 

As an additional feature, we enabled users to share event links on platforms such as Twitter, Outlook, and LinkedIn, with the simple click of a button.

Figure 2: Calendar General Page

Business Outcomes:

Our Power Apps-based calendar app enables our client to easily add, edit, and view an unprecedented volume of events. With our customizations, the Power App can display up to 5,000 event records – 3,000 more than the default capacity. Despite the capacity increase, our initial data collections reduce page load time by 90%.

With a centralized platform to detail and filter events, marketing managers spend less time searching for and consolidating information, and more time effectively executing events. Now, managers can easily export and present event details to global business teams with our export to PowerPoint feature. In addition, we enabled managers to easily share events with prospective customers through major platforms like LinkedIn, Outlook, and Twitter, increasing our client’s sales opportunity.


    Built a Power Apps-based calendar that enables users to view, edit, and add up to 5,000 event records
    Reduced report page load time (PLT) by 90%
    Enabled events to be exported to PowerPoint
    Enabled events to be shared on major social/enterprise platforms

Event management is critical before and during the event. Check out how our Power Apps-based solution easily tracks attendance and feedback for major events

Thursday, November 12, 2020

Easily Embed Power BI Content into ReactJS Apps with an Open Source Wrapper

Microsoft’s Power BI is a premier data visualization tool. Most of our Fortune 500 customers use Power BI for enhanced reporting and analytics. Power BI Embedded is an Azure Service that enables developers to embed Power BI content (dashboards, reports, tiles, and visuals) into an application.

ReactJS is the most popular frontend framework for building user interfaces (UIs). Within the ReactJS community, there is high demand to easily embed Power BI content. Prior to this solution, users needed to maintain both the ReactJS app and Power BI embedding lifecycles, leading to time-consuming code rewrites. Driven to adopt and innovate with the latest technologies, the Microsoft Power BI Embedded product team partnered with us to create an open source solution.


We assisted Microsoft’s Power BI Embedded product team in developing an open source wrapper that enables users to easily embed Power BI content into React-based applications. The wrapper consists of a component library called powerbi-client-react, which eliminates the need to write large amounts of code.

The wrapper was implemented in TypeScript and is compiled to JavaScript. You can consume the wrapper from NPM or find the source code on GitHub.
Figure 1: Solution Design

Use Cases

    Embed Power BI content into a React-based application
    Reference embedded content
    Apply cascading style sheets (CSS) class style
    Set and reset event handlers
    Set new access token after expiry
    Update embed settings
    Bootstrap Power BI content
    Author Power BI reports
    Phased embedding

To see the wrapper in action, check out the DEMO on GitHub.


    Reduces development effort by approximately 20%
    Lightweight: 39 kB (unpacked)
    Supports embedding of all Power BI content
    Supports all features of Power BI Embedded
    Provides smart props that manage the complete embedding lifecycle
    Wrapper components can be used as independent plug and play components
    Compatible with Power BI report authoring library, enabling an application to edit the report from the embedded report
    Hosted on NPM and open sourced on GitHub under MIT License

At the time of publishing this article, the powerbi-client-react wrapper has 2,216 weekly downloads on NPM, and has 77 “Stars” and 12 “Forks” on GitHub.