Wednesday, June 7, 2023

ADX Implementation in the Energy and Utilities Industry

The need for real-time insights and scalable solutions  

Our client, a leader in the energy and utilities sector, offers secure and innovative IoT solutions that fill crucial gaps in the industry. They provide devices, networks, software, and services that have been proven effective in challenging environments across the globe. 

Our client needed help extracting real-time insights from large volumes of data generated across diverse applications. To maintain a competitive edge, they sought a scalable solution with reduced data latency and accelerated data processing, enabling efficient management of this substantial data volume.

The ask

      •  Stage IoT data and consume it in Power BI reports. 
      •  Reduce the time taken to refresh Power BI datasets. 
      •  Handle dynamic schema changes in incoming data. 

Addressing the challenges 

After exploring various techniques using MongoDB, Cosmos DB, Synapse, and SQL Server, our client encountered obstacles in addressing their data analytics needs. Recognizing the limitations of these approaches, they turned to our expertise for a robust and effective solution.

Following an extensive analysis, we identified the following solutions to effectively overcome the primary challenges: 


Figure 1: Key Technologies

      •   Ingest and transform semi-structured JSON data obtained from network logs
We developed a script to ingest and stage data in Azure Data Explorer (ADX) while ensuring integrity and reliability. The script appends new data, handles dynamic schema changes, and logs every data staging step. This approach easily accommodated data with a new schema in the existing table without data loss. Setting up partitioning and merge policies on the table improved ingestion time. It also enabled us to effectively manage a year's worth of data in Power BI, facilitating comprehensive analysis. (See the first sketch after this list.)
 
      •   Ability to handle large volumes of data in Azure Data Explorer (ADX)
We leveraged Power BI DirectQuery mode to utilize the computing power of the ADX cluster. This let the solution handle data volumes that were difficult to achieve with MongoDB, Cosmos DB, Synapse, or SQL Server. (See the second sketch after this list.)
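
To illustrate the staging step, here is a minimal Python sketch of the ingestion flow, assuming the azure-kusto-ingest package; the cluster URI, database, table, and mapping names are hypothetical placeholders, not the client's actual configuration.

from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

CLUSTER_INGEST_URI = "https://ingest-<cluster>.kusto.windows.net"  # hypothetical
DATABASE = "IoTStaging"    # hypothetical database name
TABLE = "RawNetworkLogs"   # hypothetical staging table with one dynamic column

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(CLUSTER_INGEST_URI)
client = QueuedIngestClient(kcsb)

# Each JSON record lands in a single dynamic column, so new attributes in the
# payload are absorbed without schema changes or data loss.
props = IngestionProperties(
    database=DATABASE,
    table=TABLE,
    data_format=DataFormat.MULTIJSON,
    ingestion_mapping_reference="RawJsonMapping",  # hypothetical mapping name
)

def stage_file(path: str) -> None:
    result = client.ingest_from_file(path, ingestion_properties=props)
    # Every staging step is logged so failures can be traced and retried.
    print(f"Queued {path} for ingestion: {result}")

The partitioning and merge policies that improved ingestion time would be set separately on the table through ADX management commands.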
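
To show why DirectQuery scales here: the heavy aggregation runs on the ADX cluster, and only a small result set crosses the wire. This sketch runs a comparable KQL query directly via the azure-kusto-data package; the cluster URI, database, and table are again hypothetical.

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<cluster>.kusto.windows.net"  # hypothetical query endpoint
)
client = KustoClient(kcsb)

# Similar in spirit to the KQL that Power BI DirectQuery pushes down for a
# visual: the cluster aggregates a year of data and returns only 365 rows.
QUERY = """
NetworkLogs
| where Timestamp > ago(365d)
| summarize Events = count(), AvgLatency = avg(LatencyMs) by bin(Timestamp, 1d)
"""

response = client.execute("IoTStaging", QUERY)  # hypothetical database name
for row in response.primary_results[0]:
    print(row["Timestamp"], row["Events"], row["AvgLatency"])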

Our winning solution 

We successfully harnessed the power of Power BI and Azure Data Explorer (ADX) to analyze and visualize semi-structured upstream data. Our solution not only facilitated efficient data ingestion and transformation but also effectively tackled issues pertaining to data refresh and dynamic schemas.

Figure 2: Impact

Flow diagram

      •  JSON files are present in the local system.  
      •  A Python ingestion script ingests the data into a single column of the temp table.  
      •  If any ingestion fails, the details are written to the Main Table Ingestion Failure logs.  
      •  Functions in ADX populate the derived tables (see the sketch after this list).  
      •  The status of each derived table ingestion is written to the Derived Table Ingestion logs.  
      •  The Power BI report follows this data-sourcing approach:  
            •  Data is sourced from ADX using a composite model.  
            •  Larger tables are sourced through DirectQuery mode.  
            •  Smaller dimension tables are sourced through Import mode. 
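
One common way to implement the derived-table step is an update policy: a stored function reshapes the raw dynamic column, and ADX applies it automatically on every ingestion. The sketch below sends the management commands with the azure-kusto-data package; all names are hypothetical, and the update-policy approach is our assumption of one workable pattern, not the client's confirmed implementation.

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<cluster>.kusto.windows.net"  # hypothetical
)
client = KustoClient(kcsb)
DATABASE = "IoTStaging"  # hypothetical

# A stored function that projects typed columns out of the raw JSON column.
client.execute_mgmt(DATABASE, """
.create-or-alter function ExpandNetworkLogs() {
    RawNetworkLogs
    | project Timestamp = todatetime(Record.timestamp),
              DeviceId  = tostring(Record.deviceId),
              LatencyMs = todouble(Record.latencyMs)
}
""")

# The update policy appends the function's output to the derived table on
# every batch ingested into the staging table; failures surface in the
# ingestion logs referenced above.
client.execute_mgmt(DATABASE, """
.alter table NetworkLogs policy update
@'[{"IsEnabled": true, "Source": "RawNetworkLogs", "Query": "ExpandNetworkLogs()", "IsTransactional": false}]'
""")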

 

Figure 3: Solution Architecture




Friday, June 2, 2023

Modern Data Analytics using Microsoft Fabric



The launch of Microsoft Fabric for public preview is here! 

Our journey with Fabric's product engineering teams has been a thrilling six-month experience. Throughout this journey, we had the opportunity to evaluate various customer scenarios and share our insights with the Microsoft team. 

Being part of realizing the product vision has been a valuable experience in our longstanding partnership with Microsoft. It has been an honor to support the Microsoft team on their transformative journey.  

In this next phase, we eagerly look forward to leveraging Microsoft Fabric to empower our customers. Fabric's unified platform offers immense potential to transform the way our customers approach data analytics. Fabric offers a seamless experience for data storage, access, security, and reporting management, with more features coming soon. 

We are excited to be at the forefront of this groundbreaking initiative and remain committed to driving innovation through the power of Fabric. 


Data modernization

Microsoft Fabric empowers organizations to achieve more with their tracked data through easier, more secure, and cost-effective management. With Microsoft Fabric, you will have a comprehensive data platform that enables you to:  

1. Avoid data silos, data duplication, and vendor lock-in  
One copy of data is stored and referenced in OneLake—a SaaS, multi-cloud data lake—throughout data engineering, analytics, ML insight generation, and reporting processes. 

2. Significantly reduce costs, improve collaboration, and simplify purchasing  
Microsoft Fabric uses a simplified single licensing model for all data needs, reducing the overall cost of ownership. 

3. Accelerate time to market  
Integration of technologies and AI Copilot make Microsoft Fabric the fastest way to develop and get insights from data. 

4. Foster a data-driven culture within your organization via data democratization  
Azure OpenAI integration enables business users to generate more value from their data. 

5. Define and manage data security and lineage in a single place (OneLake)  
Fabric provides a standard security compliance model for your data, all in one place. It enables easy tracking of data lineage across processing and reporting scenarios for a comprehensive understanding of data flow. 


COMETS framework 

Through our decades of experience, we have created a framework to guide our processes. Our COMETS framework combines expertise, best practices, and modern techniques to help customers maximize their business opportunities. Microsoft Fabric makes it easy for organizations to take the next step in their modernization journey and enables all the principles identified in our COMETS framework. 

Customer-centric: Fabric provides a unified experience and architecture that caters to the specific needs of various roles in the analytics process. This includes data engineers, data warehousing professionals, data scientists, data analysts, and business users.  

Optimize: Fabric offers a lake-centric approach with OneLake, simplifying the creation, integration, management, and operation of data lakes. OneLake eliminates data duplication and promotes data discovery, sharing, governance, and compliance. Fabric also provides compatibility with Azure Data Lake Storage Gen2. The consolidation of services allows you to manage all your resources—from storage to computing and reporting—under one capacity. 
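
Because OneLake exposes Azure Data Lake Storage Gen2 APIs, existing ADLS tooling can read it directly. Here is a minimal sketch, assuming the azure-identity and azure-storage-file-datalake packages; the workspace and lakehouse names are hypothetical.

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake speaks the ADLS Gen2 protocol: a Fabric workspace maps to a file
# system (container), and each item maps to a top-level directory.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

fs = service.get_file_system_client("MyWorkspace")        # hypothetical
for path in fs.get_paths("MyLakehouse.Lakehouse/Files"):  # hypothetical
    print(path.name)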

Modernize: Fabric embraces open data formats, treating Delta on top of Parquet files as the default format for all workloads. This format enables customers to work with structured and unstructured data, reducing the need for separate copies of data for different analytic uses. 
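
For example, in a Fabric notebook, saving a Spark DataFrame as a table produces Delta on top of Parquet by default, so the same files serve Spark, SQL, and Power BI. A minimal PySpark sketch; the table name and sample rows are illustrative.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [("meter-001", 42.5), ("meter-002", 39.1)],
    ["device_id", "reading"],
)

# In Fabric, saveAsTable writes the Delta format by default, so no separate
# copies of the data are needed for different analytic engines.
df.write.format("delta").mode("append").saveAsTable("SensorReadings")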

Empower: Fabric leverages AI capabilities through Azure OpenAI Service, allowing users to unlock the full potential of their data. Copilot in Microsoft Fabric enables conversational language interactions for creating dataflows, generating code, building machine learning models, and more. 

Transformative: Fabric deeply integrates with Microsoft 365 applications, such as Power BI, Excel, Teams, PowerPoint, and SharePoint. This integration enables users to discover, analyze, and apply insights directly from their everyday tools. It also fosters a data culture where everyone can make better data-based decisions. 

SaaS-ification: Fabric provides a unified architecture that encompasses all aspects of data management. This allows organizations to design and implement reusable frameworks that can be applied across various analytics projects. Seamless integration with popular analytics tools and libraries, such as Python, R, and Apache Spark, enables organizations to leverage their existing codebase and libraries. Fabric allows users to create customizable templates for common analytics tasks or workflows. These templates can include pre-configured settings, data transformations, and analytics algorithms, making it easier to replicate and reuse them across projects. 

---------------------------

For demos, workshops, and getting insights on how to pilot Microsoft Fabric within your organization, please reach out to Sales@MAQSoftware.com

You can read more about Microsoft Fabric at aka.ms/fabric

Friday, March 24, 2023

Everything You Need to Know About Migrating to Power BI


What is Power BI?

Power BI is a business intelligence (BI) tool that enables users to easily transform their data into actionable insights. Business intelligence refers to the collection and analysis of business operation data. Insights from this data enable business leaders to identify growth opportunities and close operational gaps.

Previously, BI reporting platforms required developer expertise. Today, anyone can develop intuitive, insightful dashboards and reports with the right tool. Power BI is a powerful, easy-to-use tool that offers a wide range of storytelling visuals that help you understand your business opportunities. 

Power BI Migration Strategy

Large-enterprise companies rely on insights from BI platforms. When companies use unoptimized platforms, insights are slow, inaccurate, and siloed, delaying business-critical decisions for weeks. The most common issues our clients face with other BI platforms are:
1.    Slow-loading pages and reports
2.    Difficulty managing and maintaining multiple data sources
3.    Decentralized reporting
4.    High maintenance costs

Increasingly often, large-enterprise companies are turning towards Power BI. Power BI’s centralized, dynamic reporting better addresses their real-time business needs. However, migrating large volumes of data from enterprise systems can be challenging.
Managing terabytes of data and training thousands of team members in a new system requires meticulous planning. That’s where we come in. After leading over 100 Power BI migrations for large-enterprise companies, and implementing over 8,000 Power BI solutions, we’ve developed a simple six-step migration strategy. With our strategy, you can rest assured that we’ve covered all the bases, enabling you to migrate seamlessly to Power BI.

1. Requirement Gathering and Analysis

Before we start actually moving data, it’s important to understand your current landscape. This means evaluating the existing reporting platform to understand your current needs, key functionalities, and gaps. We examine reports, dashboard usage, UI/UX, audiences, data sources, and security to create a report inventory and data estate. This information determines your migration scope, performance requirements, and complexity.

2. Planning and Design

Now that we understand your existing landscape, it’s time to move on to developing a road map. This sets the stage for the migration’s success. As Antoine de Saint-Exupéry once said, “A goal without a plan is just a wish.”

Here, we propose a solution based on all the requirements gathered in step one. To ensure everyone agrees with the plan of action, we set up a proposal meeting that involves architects, data administrators (admins), infrastructure admins, legal and security teams, and the Power BI product team (if required).

In general, we divide planning and design into five sub-steps:

1.  Perform a detailed gap analysis to identify the feature, visualization, and modeling challenges we need to address during migration
2.  Propose a Power BI architecture, including security implementation, refresh latency, and report performance
3.  Design report templates and prepare mock-ups
4.  Define the scope for automated validation
5.  Propose a deployment strategy and implementation timeframe

3. Execution

Now, it’s time to implement the approved solution architecture. Because we spend so much time on the planning stage, this step is straightforward. To optimize our workflow, we follow the agile methodology with the following steps:

1.  Sprint plan: Create a product backlog and define the scope and length of sprints
2.  Implementation: Using best practices, reusable templates, and themes, start the report migration and provide incremental report builds
3.  Performance tuning: Refine the architecture and report layout to optimize the data model for performance
4.  Testing: Use a set of in-house performance analysis tools to automate testing, which tracks query performance and suggests visual layout and data validation optimizations
5.  Deployment: Close out the sprint by automating report deployment and readying the build for user acceptance testing (UAT); a sketch of one way to automate deployment follows this list
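
As one example of the deployment automation in step 5, the Power BI REST API can publish a .pbix build into a workspace. This is a hedged sketch, assuming an Azure AD access token has already been acquired (for example, via MSAL); the workspace ID, dataset name, and file path are hypothetical.

import requests

ACCESS_TOKEN = "<aad-token>"       # hypothetical: acquired via MSAL beforehand
WORKSPACE_ID = "<workspace-guid>"  # hypothetical target workspace
PBIX_PATH = "sales_report.pbix"    # hypothetical report build artifact

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports"
    "?datasetDisplayName=SalesReport&nameConflict=CreateOrOverwrite"
)

# Multipart upload of the .pbix; CreateOrOverwrite makes the call repeatable
# across sprint deployments.
with open(PBIX_PATH, "rb") as f:
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        files={"file": f},
    )

response.raise_for_status()
print("Import queued:", response.json()["id"])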

4. Deployment and Post-Production

During this step, we ensure the new reports are user-friendly and high-performing. First, we conduct numerous UAT sessions. UAT ensures the reports are optimized for their target audience. Once we receive sign-off for UAT and production, it’s time for deployment. We automate deployment, giving end users immediate access to the reports. To complete the transfer of ownership, we hand off the code, report, and workspace inventory to our client.

For many companies, Power BI migration ends here. However, we believe that successful adoption is a critical part of migrating to Power BI. That’s why we dedicate the next two steps to post-migration success.

5. Center of Excellence (CoE)

According to Microsoft, “A Center of Excellence (CoE) drives innovation and improvement and brings together like-minded people with similar business goals to share knowledge and success, while at the same time providing standards, consistency, and governance to the organization.”

During our CoE trainings, we enable our clients to become self-sufficient Power BI users. We run numerous CoE sessions that train team members across your organization in Power BI capabilities, governance, and best practices. These enable technical users, business users, and operations team members to become familiar with the new data system as the old system is gradually moved offline. Our custom trainings include regular office hours with certified engineers, an advanced curriculum, and pre-built solutions and frameworks. On average, our CoEs shorten the Power BI adoption timeframe from years to months.

If you are already at this migration stage, or need some help boosting Power BI adoption, check out our virtual CoE trainings, offered to any organization year-round:
  Admin CoE 

6. Decommissioning

There’s nothing worse than a cluttered data system. To avoid redundancies, we systematically retire old reports. Here, our main goal is moving you onto the new system without impacting ongoing business operations. At MAQ Software, we believe migration to Power BI should be seamless.

Figure 1: Complete Process Overview

Benefits of Migrating to Power BI

By migrating to Power BI using our six-step approach, our clients have benefitted from:

Quicker Insights for Decision-Makers
  Reduced latency between data sources and reports  
  Increased scalability 

Self-Service BI
  Business users can create reports and customize dashboards without developer expertise  

Centralized Reporting
  Admins can easily manage and govern their organization’s reports with centralized administrative capabilities 
  Users can combine data from different sources, such as Dynamics 365, Office 365, and hundreds of other supported data connectors
  Increased accuracy by offering a single source of truth for all reporting layers through shared datasets 

Power BI Migration Case Studies

As the 2021 Microsoft Power BI Partner of the Year, we have experience migrating clients from a wide variety of data visualization platforms to Power BI. Our expertise enables us to easily manage large volumes of data and enable business continuity throughout the migration process. Here is a sample of how we’ve empowered our clients to migrate to Power BI.

Tableau to Power BI

Client: An international fast-moving consumer goods (FMCG) company.

Situation: Our client wanted to centralize their reporting platforms by migrating from Tableau to Power BI. Because their existing Tableau reports had been developed incrementally over time, migrating them without compromising functionality was complex.

How We Did It: We discussed each report in detail to understand its underlying business purpose. Then, we used our knowledge of Power BI to identify the best methods of achieving the same results in a new system. Spending time with the actual report users gave us insight into end user flow, enabling us to design an intuitive Power BI report.

Results: We migrated over 250 Tableau workbooks to Power BI. The new reports were better organized and decluttered. With easy navigation and optimized design, the new reports achieve the same functionalities as the old ones, with increased performance and accessibility. Our Center of Excellence trainings also helped increase post-migration adoption by 300%.

Qlik to Power BI

Client: EF Education First, a global education foundation with offices in 50 countries.

Situation: EF Education First needed a modern reporting platform with self-service analytics, easy scalability, and low operational costs.

How We Did It: We performed a gap analysis of the features and visualizations in Qlik and Power BI. Qlik supported 16 reports, with a data source consisting of 20+ SQL databases and 30 Excel sources. We ensured all required data could be transferred and visualized per the client’s needs.

Results: Power BI’s low-code architecture and cloud-based centralization gave EF Education First access to self-service scalability.

Find out more about our QlikView to Power BI migration

SAP Business Objects (SAP BO) to Power BI

Client: A multinational food, snack, and beverage corporation with 250,000+ team members.

Situation: With our client’s high volume of data, their existing SAP BO reports took over five minutes to load. Running many slow-loading reports took up the team’s valuable time, negatively impacting business operations.

How We Did It: We implemented a tabular model with Azure Analysis Services (AAS) to enable fast, efficient reporting in Power BI. Data loads from our client’s existing Teradata storage into AAS. For users with alternate view and calculation needs, reports can be exported directly from AAS to Excel. AAS is more equipped to store the huge models and pre-aggregated data needed for real-time visualization. AAS provides a dedicated central processing unit (CPU) and memory, independent of the load on premium capacity.

Results: Migrating from SAP BO to Power BI reduced report run time by 90%. Previously, reports could take up to five minutes to load. With our solution’s back-end Azure Analysis Services (AAS), dense data now loads into Power BI in less than 20 seconds. Users can rapidly customize and run reports without the wait. AAS also has a built-in feature that provides time intelligence for KPIs on the fly.

MicroStrategy to Power BI

Client: A global Fortune 500 retailer.

Situation: Our client sends weekly and monthly report snapshots to subscribed internal and external users. MicroStrategy offers an easy and intuitive method to share reports like this. However, our client had recently migrated their other systems to Power BI as it offered better long-term scalability. To reduce costs, our client wanted to consolidate all functionalities to a single platform. We needed to implement a similar export/subscription functionality using Power Platform.

How We Did It: We used the existing subscription list and created a security model that works with Power Automate schedules. Then, we converted data tables in MicroStrategy to paginated reports in Power BI. Using the Export API, the data can now be exported as an attachment to share with external and internal users.
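
For illustration, the export step can be driven by the Power BI ExportToFile REST API, which a Power Automate flow could call on the subscription schedule. This is a hedged sketch, assuming a pre-acquired Azure AD token; the workspace and report IDs are hypothetical.

import time
import requests

TOKEN = "<aad-token>"        # hypothetical
GROUP = "<workspace-guid>"   # hypothetical
REPORT = "<report-guid>"     # hypothetical paginated report
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP}/reports/{REPORT}"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Start an asynchronous export of the paginated report to PDF.
job = requests.post(BASE + "/ExportTo", headers=HEADERS,
                    json={"format": "PDF"}).json()

# Poll until the export completes, then download the file to attach to email.
while True:
    status = requests.get(BASE + f"/exports/{job['id']}", headers=HEADERS).json()
    if status["status"] == "Succeeded":
        break
    if status["status"] == "Failed":
        raise RuntimeError("Export failed")
    time.sleep(5)

pdf = requests.get(BASE + f"/exports/{job['id']}/file", headers=HEADERS)
with open("weekly_snapshot.pdf", "wb") as f:
    f.write(pdf.content)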

Results: We helped our client retire their outdated MicroStrategy reports without losing their easy sharing capabilities. Because Power BI is part of the Power Platform, it integrates seamlessly with other powerful tools, such as Power Automate and Power Apps. Now, our client can view dashboards, manage reports, and share insights using a platform that is both scalable and sustainable.

Looker to Power BI

Client: A leading retail firm that provides office supplies. 

Situation: Our client sought a centralized business intelligence (BI) platform that delivers low operational and maintenance costs while providing self-service analytics capabilities. They also required a seamless migration from their on-premises data source to a cloud-based one. 

How We Did It: Our team established a centralized Power BI dataset by importing data from a cloud-based source. To optimize query performance and minimize costs, we implemented custom partitioning and incremental refresh policies in Power BI. By doing so, we reduced the overall number of queries fired at the cloud-based source. Our solution also met the customer's requirements for data refresh latency, ensuring that the dataset was always up to date and readily available for analysis. 
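
One way to realize such partition-level refreshes is the Power BI enhanced refresh REST API (available on Premium capacities), which can refresh a single partition instead of the whole dataset. A hedged sketch; the IDs, table, and partition names are hypothetical.

import requests

TOKEN = "<aad-token>"        # hypothetical
GROUP = "<workspace-guid>"   # hypothetical
DATASET = "<dataset-guid>"   # hypothetical

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP}/datasets/{DATASET}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "type": "full",
        # Refresh only the newest partition, so far fewer queries are fired
        # at the cloud source than a full dataset refresh would generate.
        "objects": [{"table": "Orders", "partition": "Orders-2023-06"}],
    },
)
resp.raise_for_status()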

Results: We assisted the client in retiring their Looker reports and migrating to Power BI, empowering end-users with self-service reporting capabilities. With Power BI's user-friendly interface, users can easily customize their report views and gain valuable insights. Power BI's built-in export functionalities also enable users to seamlessly share their findings with others, making it a more collaborative and efficient tool for the client's reporting needs. 

Cognos to Power BI

Client: A global service provider in the Health, Tax & Accounting, Legal & Regulatory, and Finance industries. 

Situation: Due to our client's high volume of reports, their existing Cognos reporting system had a high cost per click, on top of having limited UI features. These drawbacks negatively impacted business operations. 

How We Did It: We implemented a tabular Microsoft SQL Server Analysis Services (SSAS) model that allowed for fast and efficient reporting in Power BI. The data from the client's existing data warehouse was loaded into SSAS, which is better equipped to store the large models and pre-aggregated data needed for real-time visualization. With SSAS as the backend, reports can be generated directly from Excel for users with specific business and calculation needs. Additionally, SSAS provides a dedicated CPU and memory, which further optimizes the reporting process. Powerful features such as Export, Subscribe, and User Management (which can restrict users with lower privileges from publishing reports to the workspace) can easily be customized and managed using Power BI Report Server. 

Results: Migrating from Cognos to Power BI reduced the cost per click by ~50%, while the aesthetically appealing visuals also improved the usage of the reports. Our solution allows for dense data to load into Power BI in less than 3 seconds, allowing users to rapidly customize with a better UI and run reports without delay. With SSAS, there is a built-in feature that provides time intelligence for KPIs on the fly, which further enhances the reporting process. 

***

While our six-step migration strategy provides a general framework for success, each organization’s needs are different. Need help getting your Power BI migration on track? Partner with us by emailing Sales@MAQSoftware.com.

Originally posted June 16, 2020

Up Next

To further improve your Power BI performance, check out our Power BI Best Practice Guide.

Wednesday, February 22, 2023

Transforming Customer Support with AIOps-Driven Power BI Embedding

Need for a transformation

In today's rapidly evolving networking industry, our client, a leading company in the hardware and software sector, realized the importance of a transformative approach towards customer support. With the proliferation of data and the increasing complexity of networks, customers require real-time visibility into their network performance, which can be challenging to deliver with traditional reporting methods. 

By adopting an AIOps-driven approach and providing fully integrated and customer-accessible operational dashboards and reports, our client aimed to enhance their customers' experience by delivering actionable insights and analytics at scale. With this, the company required a reporting visualization solution that aligned with their business and technical requirements and provided an exceptional user experience for both customers and internal users.

The ask

·         Embed a reporting solution (Power BI) in their Salesforce portal that can scale to support 500 ISV customers (approximately 13 times the current usage). 

·         Migrate the current Tableau solution to Power BI.

·         Integrate the solution with the backend for data refresh.

·         Automate new customer provisioning. 

·         Deploy templatized reports for new customers while ensuring that only relevant information is displayed and that security is maintained.

·         Enable self-service capability for ISV customers.


Figure 1: Solution Architecture

Tackling the task

To address the client's requirements, we performed an assessment and proposed an architecture that aligned with their business and technical needs. The team also had multiple calls with their Salesforce, Snowflake, and Power BI teams to further define the architecture and ensure seamless integration.

To ensure a secure integration of Power BI with the current Salesforce portal, we started by evaluating the portal's capacity to handle the expected system load and user distribution. Our team then defined the necessary system calls to enable interaction with the backend system and Power BI. To meet the solution requirements, we provided a scalable compute solution in Azure. All in all, this ensured that the integration was not only secure but also optimized for performance.
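
At the heart of that interaction is embed-token generation: the backend requests a short-lived token from the Power BI REST API and hands it to the Salesforce page, which renders the report with the Power BI JavaScript SDK. A hedged sketch, assuming a service principal token; the workspace and report IDs are hypothetical.

import requests

API_TOKEN = "<service-principal-aad-token>"  # hypothetical
GROUP = "<customer-workspace-guid>"          # hypothetical per-customer workspace
REPORT = "<report-guid>"                     # hypothetical

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP}/reports/{REPORT}/GenerateToken",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"accessLevel": "View"},  # read-only: customers can view, not edit
)
resp.raise_for_status()

embed_token = resp.json()["token"]
# The Salesforce component passes embed_token to the Power BI JavaScript SDK,
# which renders only that customer's report.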

After conducting a cost analysis based on the user base and sharing the options for Power BI capacity, we showcased a proof of concept (POC) that provided firsthand experience of the end product, demonstrating how Power BI would be embedded in the Salesforce portal.

Communication is key

Throughout the span of the project, communication was an integral element between MAQ Software and the client. Effective communication was instrumental in identifying dependencies early in the development cycle and proved to be even more crucial when roadblocks were encountered. One such challenge occurred during the implementation phase when the backend Snowflake framework moved to a new version (v2). Our team swiftly addressed customer concerns and proposed a plan to move forward. By evaluating the changes, we were able to adapt and update the reporting solution data model, keeping on track with our delivery of a stellar solution.

Figure 2: Delivered Items


The solution and outcome

Ultimately, our final solution included:

      •  A Salesforce embedded component to securely show Power BI reports 
      •  Around 30-40 customer-specific ISV reports 
      •  An automated customer provisioning pipeline to set up reports/workspaces and deploy template reports (a sketch follows this list) 
      •  An automated release pipeline to deploy new features to all customers 
      •  A subscription solution to configure and send report subscription emails to users in PDF/PPT formats 
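
As a hedged illustration of the provisioning pipeline, the Power BI REST API can create a dedicated workspace per customer and clone the template report into it; the token, IDs, and naming convention below are hypothetical, not the client's actual setup.

import requests

TOKEN = "<aad-token>"                # hypothetical
TEMPLATE_GROUP = "<template-ws-id>"  # hypothetical workspace holding templates
TEMPLATE_REPORT = "<report-guid>"    # hypothetical template report
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = "https://api.powerbi.com/v1.0/myorg"

def provision_customer(name: str) -> str:
    # 1) A dedicated workspace keeps each ISV customer's content isolated.
    ws = requests.post(f"{BASE}/groups?workspaceV2=True",
                       headers=HEADERS, json={"name": f"ISV - {name}"})
    ws.raise_for_status()
    workspace_id = ws.json()["id"]

    # 2) Clone the template report into the new workspace; a release pipeline
    #    can later rebind it to the customer's own dataset.
    clone = requests.post(
        f"{BASE}/groups/{TEMPLATE_GROUP}/reports/{TEMPLATE_REPORT}/Clone",
        headers=HEADERS,
        json={"name": f"{name} Operations", "targetWorkspaceId": workspace_id},
    )
    clone.raise_for_status()
    return workspace_id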

By embedding Power BI into their reporting solution, our client was able to significantly enhance their customer support experience, providing an intuitive, real-time view of their customers' network performance and reducing the maintenance overhead. The new solution streamlined their reporting process, reduced turnaround time for report creation, and provided an exceptional user experience for both customers and internal users. 


Figure 3: Impact