July 18, 2025

Enhancing ISV reporting with an SAP BO to Power BI migration


Overview

A leading global provider of professional information, software solutions, and services embarked on a large-scale initiative to modernize its reporting infrastructure. The goal was to unify legacy reporting systems into a cloud-first, interactive analytics environment using Microsoft Power BI.

This initiative, known internally as the Unified Reporting Platform (URP), was driven by the need to:

·       Consolidate approximately 12,000 legacy reports and standardize KPIs previously fragmented across multiple platforms.

·    Leverage modern BI features to enhance interactivity, scalability, and user experience.

·    Provide out-of-the-box, centrally governed reports maintained by IT.

·    Empower business users with self-service analytics via a curated and governed semantic model.

·    Lay the foundation for future integration with an advanced data lake architecture leveraging Microsoft Fabric.


Challenges

Before the migration, the client faced several pain points across their legacy reporting tools (SAP BusinessObjects, Tableau, and Cognos):

·       Duplication and inconsistency: Over 12,000 reports existed across different platforms, leading to redundant efforts and mismatched KPIs.

·    Rising costs: Maintaining multiple tools resulted in high licensing, infrastructure, and support expenses.

·    Limited interactivity: The legacy systems lacked modern features such as drill-downs, dynamic filters, and personalized views.

·    Performance issues: Reports suffered from long load times and frequent timeouts during peak usage.

Our approach

Phase 1: Discovery & Assessment

Using our proprietary MigrateFAST accelerator, we performed a comprehensive assessment of the client’s SAP BO environment:

·       Extracted report metadata and usage statistics to identify duplicates and underutilized assets.

·    Clustered reports by business subject areas and visual similarities.

·    Developed a rationalization matrix to consolidate similar reports into broader, multifunctional Power BI dashboards.

Outcome: Reduced over 12,000 legacy reports to 6,500 optimized reports, ensuring KPI consistency and improved usability.
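
The clustering step above can be illustrated with a small metadata-similarity check. The sketch below is a simplified stand-in for the MigrateFAST analysis: it assumes a hypothetical CSV export of report names and field lists, and flags report pairs whose metadata overlaps enough to be consolidation candidates.

```python
# Minimal sketch: surface likely-duplicate legacy reports by comparing the
# fields each report uses. Assumes a hypothetical metadata export with
# "report_name" and "fields" (semicolon-separated) columns; the 0.8
# similarity threshold is illustrative only.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reports = pd.read_csv("bo_report_metadata.csv")  # hypothetical MigrateFAST export

# Represent each report by its name plus the fields it references.
corpus = reports["report_name"] + " " + reports["fields"].str.replace(";", " ")
tfidf = TfidfVectorizer(stop_words="english").fit_transform(corpus)
similarity = cosine_similarity(tfidf)

# List report pairs whose metadata overlaps heavily - candidates for
# consolidation into a single Power BI dashboard.
threshold = 0.8
for i in range(len(reports)):
    for j in range(i + 1, len(reports)):
        if similarity[i, j] >= threshold:
            print(f"{reports['report_name'][i]}  <->  {reports['report_name'][j]}  "
                  f"({similarity[i, j]:.2f})")
```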


Phase 2: Data Modeling & Architecture Design

·     Identified essential fact and dimension tables to support key subject areas like Spend, Vendor, Matter, and Timekeeper.

·     Developed SQL scripts and stored procedures for data cleansing, transformation, and rule enforcement before loading into Power BI.

·     Designed a hybrid storage strategy in Power BI:
        - Import mode for high-performance datasets
        - DirectQuery for large, real-time datasets

·     Applied star schema principles to define relationships, ensuring optimal performance and accurate reporting.

·     Built a centralized semantic model with reusable KPIs, date tables, and calculation groups to maintain consistency across reports.

·     Implemented dynamic Row-Level Security (RLS) using DAX and user-role mappings to enforce governed data access.
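
The dynamic RLS approach above typically pairs a DAX filter (for example, matching a UserEmail column against USERPRINCIPALNAME()) with a user-to-role mapping table maintained alongside the warehouse. A minimal sketch of building such a mapping table follows; the table, columns, and storage target are hypothetical stand-ins, not the client's actual security model.

```python
# Minimal sketch: build the user-to-scope mapping table that a dynamic RLS
# filter joins against. All table and column names are illustrative.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical entitlement extract: which regions each user may see.
entitlements = pd.DataFrame(
    {
        "UserEmail": ["ana@contoso.com", "raj@contoso.com", "ana@contoso.com"],
        "Region": ["EMEA", "APAC", "NA"],
    }
)

# One row per (user, region) keeps the RLS filter a simple equality join.
mapping = entitlements.drop_duplicates().reset_index(drop=True)

# Local SQLite stands in for the warehouse; the semantic model would import
# this table as a security dimension related to the fact tables.
engine = create_engine("sqlite:///security_demo.db")
mapping.to_sql("dim_user_security", engine, if_exists="replace", index=False)
print(mapping)
```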


Phase 3: Report Development

·    Designed consolidated Power BI reports to replace fragmented legacy reports.

·    Built standardized templates to ensure consistent UX/UI.

·    Implemented features like interactive drill-through, dynamic tooltips, and the “Personalize this visual” functionality.

·    Optimized DAX measures and used pre-processed views to accelerate load times.

·    Developed reusable themes, templates, and a DAX dictionary for future scalability.

·    Employed Git-based version control with CI/CD pipelines through Azure DevOps for seamless deployment.
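
As a concrete example of the CI/CD deployment step above, the sketch below publishes a .pbix file to a workspace through the Power BI REST Imports API from a pipeline job. The workspace ID, file path, and access token are placeholders; a real Azure DevOps pipeline would acquire the token with a service principal.

```python
# Minimal sketch: publish a .pbix to a Power BI workspace from a CI/CD job
# using the REST Imports API. IDs, paths, and the token are placeholders.
import requests

ACCESS_TOKEN = "<Azure AD token for the Power BI API>"   # e.g., acquired via MSAL
WORKSPACE_ID = "<workspace-guid>"
PBIX_PATH = "reports/SpendOverview.pbix"                 # hypothetical report file
DATASET_NAME = "SpendOverview"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports"
    f"?datasetDisplayName={DATASET_NAME}&nameConflict=CreateOrOverwrite"
)

with open(PBIX_PATH, "rb") as pbix:
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        files={"file": pbix},  # multipart upload expected by the Imports API
    )

response.raise_for_status()
print("Import started:", response.json().get("id"))
```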


Phase 4: Testing & Validation

·    Conducted side-by-side validation between SQL outputs and Power BI reports.

·    Engaged business analysts to certify accuracy.

·    Simulated user roles to rigorously test RLS implementation.

·    Performed load testing to ensure responsive performance under production conditions.
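
To illustrate the side-by-side validation above, the sketch below compares a KPI computed from the cleansed SQL layer against the same figure returned by the published dataset via the Power BI executeQueries REST API. The connection string, view, table, and measure names are hypothetical, and real validation would cover many measures and slices rather than a single total.

```python
# Minimal sketch: compare a KPI from the SQL layer with the value returned by
# the published Power BI dataset (executeQueries REST API). All identifiers,
# queries, and the token are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

ACCESS_TOKEN = "<Azure AD token for the Power BI API>"
DATASET_ID = "<dataset-guid>"

# 1) Expected value from the cleansed SQL layer (hypothetical view name).
engine = create_engine("sqlite:///warehouse_demo.db")  # stands in for the real warehouse
sql_total = pd.read_sql("SELECT SUM(amount) AS total_spend FROM vw_spend", engine)["total_spend"][0]

# 2) Actual value from the semantic model, via a DAX query (hypothetical table/column).
dax_body = {"queries": [{"query": 'EVALUATE ROW("TotalSpend", SUM(FactSpend[Amount]))'}]}
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=dax_body,
)
resp.raise_for_status()
first_row = resp.json()["results"][0]["tables"][0]["rows"][0]
pbi_total = next(iter(first_row.values()))  # single-cell result

# 3) Flag discrepancies beyond a small tolerance.
if abs(sql_total - pbi_total) > 0.01:
    print(f"Mismatch: SQL={sql_total}, Power BI={pbi_total}")
else:
    print(f"Validation passed: {sql_total}")
```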

Figure 1: 6-step migration process

Impact

·       Report consolidation: Reduced ~12,000 reports to ~6,500 using MigrateFAST clustering and similarity analysis.

·    Efficiency boost: Merged 100+ reports into 40 standardized, out-of-the-box Power BI dashboards.

·    Governed self-service: Enabled business users to build their own reports and analyses on a curated, centrally governed semantic model.

·    Modern analytics experience: Introduced personalized views, responsive drill-downs, and consistent styling across reports.

·    Improved adoption: Standardized visual design reduced training needs and promoted widespread adoption.

Interested in learning more?

Contact us at CustomerSuccess@MAQSoftware.com to learn how MAQ Software can help modernize your reporting infrastructure.

July 17, 2025

Delivering embedded Power BI reporting with EmbedFAST


Customer overview

A global leader in professional information, software solutions, and services across regulated industries sought to enhance their enterprise-wide embedded analytics platform. Built on Microsoft Power BI, this Unified Reporting Platform (URP) serves both internal teams and end users, delivering actionable insights through interactive dashboards and reports.


The customer aimed to expand the platform’s self-service and administrative capabilities, including:

·       Exporting individual visuals and report content

·    Email-based report delivery through subscriptions

·    Saving report states via bookmarks

·    Embedding paginated reports

·    Sharing reports with other users

·    Enabling report authoring for analytics administrators
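
As one example of what these capabilities involve under the hood, exporting report content programmatically is typically done with the Power BI asynchronous export-to-file API. The sketch below shows the general shape of that flow; the IDs, token, and polling logic are simplified placeholders rather than the platform's actual implementation.

```python
# Minimal sketch: trigger an asynchronous PDF export of a report via the
# Power BI export-to-file API, then poll until the file is ready.
# IDs and the token are placeholders; error handling is omitted for brevity.
import time
import requests

ACCESS_TOKEN = "<Azure AD token for the Power BI API>"
WORKSPACE_ID = "<workspace-guid>"
REPORT_ID = "<report-guid>"
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/reports/{REPORT_ID}"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Start the export job.
job = requests.post(f"{BASE}/ExportTo", headers=HEADERS, json={"format": "PDF"})
job.raise_for_status()
export_id = job.json()["id"]

# Poll the job status, then download the file once it succeeds.
while True:
    status = requests.get(f"{BASE}/exports/{export_id}", headers=HEADERS).json()
    if status["status"] in ("Succeeded", "Failed"):
        break
    time.sleep(5)

if status["status"] == "Succeeded":
    pdf = requests.get(f"{BASE}/exports/{export_id}/file", headers=HEADERS)
    with open("report.pdf", "wb") as f:
        f.write(pdf.content)
```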


Challenges

The client estimated a 12-month development timeline to roll out these features. However, the organization faced a number of challenges:

·       Pressing go-to-market demands from business stakeholders

·    The need for tight integration with an internal platform

·    Limitations on using Microsoft Graph API due to compliance and infrastructure constraints

·    Strict security and access control requirements

·    A steep learning curve associated with Power BI embedding

·    Complex requirements around custom subscribe and share functionality


Our solution

To meet their goals efficiently, the customer partnered with MAQ Software to implement EmbedFAST, our production-ready accelerator for Power BI embedded analytics. EmbedFAST enabled:

·       Rapid deployment of export, bookmark, subscription, and paginated reporting features

·    Admin-level controls for managing subscriptions and report authoring

·    Seamless integration into the existing internal reporting environment

·    A centralized interface to manage all embedded content

·    More efficient licensing using capacity-based Power BI models

·    Customizable, secure sharing workflows aligned with enterprise governance policies

The entire solution was delivered in just three months, a significant improvement over the original 12-month estimate.
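
To show what governed sharing looks like at the API level, the sketch below generates an embed token scoped to a specific user and RLS role through the Power BI generate-token endpoint. It is a simplified illustration rather than EmbedFAST's code; the IDs, user, role name, and token acquisition are placeholders.

```python
# Minimal sketch: generate a Power BI embed token carrying an effective
# identity (user + RLS role), which the front end passes to the embedded
# report. IDs, names, and the service token are placeholders.
import requests

SERVICE_TOKEN = "<Azure AD token for the Power BI API>"   # e.g., service principal via MSAL
WORKSPACE_ID = "<workspace-guid>"
REPORT_ID = "<report-guid>"
DATASET_ID = "<dataset-guid>"

payload = {
    "accessLevel": "View",
    "identities": [
        {
            "username": "ana@contoso.com",        # the signed-in application user
            "roles": ["RegionalViewer"],           # hypothetical RLS role in the model
            "datasets": [DATASET_ID],
        }
    ],
}

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/reports/{REPORT_ID}/GenerateToken",
    headers={"Authorization": f"Bearer {SERVICE_TOKEN}"},
    json=payload,
)
resp.raise_for_status()
embed_token = resp.json()["token"]   # handed to the embedding client in the web app
print("Embed token expires:", resp.json()["expiration"])
```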


Collaborative development

The EmbedFAST solution was customized throughout the engagement to reflect client feedback and technical constraints:

·       Authentication & access control: Integrated with the client’s internal identity provider, supporting secure authentication and multi-tenant environments

·    Custom user management: Developed a bespoke user management module to replace Microsoft Graph API, ensuring compliance with internal security and infrastructure policies

This feedback-driven development approach ensured a solution that met the customer’s operational, governance, and user experience needs.


Business impact

·       Reduced the implementation timeline from 12 months to three months

·    Minimized custom development through prebuilt, scalable components

·    Accelerated access to enhanced analytics for business teams

·    Improved administrative workflows for data and analytics stakeholders

·    Established a scalable, secure foundation for future reporting enhancements


Contact us

At MAQ Software, we’re proud to help organizations unlock the full potential of Power BI Embedded. Contact us at CustomerSuccess@MAQSoftware.com to learn how EmbedFAST can accelerate your reporting goals.

July 11, 2025

Turning ideas into code in hours instead of months with DevelopFAST




In today’s fast-paced development landscape, teams often struggle with turning feature ideas into well-documented, implementation-ready assets. Product Owners face challenges in breaking down requirements clearly, while developers lose time on repetitive documentation, unstructured planning, and missed technical steps. DevelopFAST bridges this gap by leveraging generative AI to transform raw feature inputs into structured stories, technical plans, test cases, and code—accelerating delivery and improving quality across the board.


Problem statement

Product Owners often miss key technical or functional details when breaking large features into smaller stories. Developers, on the other hand, may find documentation burdensome and often skip or inconsistently follow critical software development steps. This leads to fragmented delivery, missed dependencies, and lower code quality.


Key features

DevelopFAST uses a customizable, generative AI-powered engine to generate:

·       User stories based on feature descriptions and acceptance criteria

·    Technical approach documents with architecture notes and security considerations

·    Test cases tailored to defined scenarios

·       Three possible implementation approaches to guide decision-making

·    Auto-generated pseudocode and code snippets aligned with best practices

·    Seamless integration with enterprise code repositories and documentation portals

·    Chat history and sharing: Saves prior executions and lets users share them with teammates, improving collaboration and traceability

·    Embedded security standards and organizational checklists built into the code and documentation output
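
As a simplified illustration of the generation engine described above, the sketch below turns a raw feature description into structured user stories with acceptance criteria using an OpenAI-compatible chat API. The model name and prompt are illustrative; DevelopFAST's actual prompts, templates, and model configuration are customizable and not shown here.

```python
# Minimal sketch: turn a raw feature description into structured user stories
# with acceptance criteria. The model name and prompt are illustrative only;
# requires the OPENAI_API_KEY environment variable to be set.
from openai import OpenAI

client = OpenAI()

feature = (
    "As part of the billing module, allow administrators to schedule "
    "monthly invoice exports to SFTP with a configurable file format."
)

prompt = f"""Break the following feature into 2-4 user stories.
For each story return: title, description, acceptance criteria (bullet list),
and any open technical questions.

Feature: {feature}"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system", "content": "You are a product owner assistant."},
        {"role": "user", "content": prompt},
    ],
)

print(response.choices[0].message.content)
```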


Solution details

·       Groundedness testing ensures 60–80% similarity to source content from real implementations

·    Customizable templates to match your organization's delivery and documentation standards

·    AI pipelines for user story generation, formatting, pseudocode, and testing

·       Plug-and-play integration with existing SDLC tools (e.g., Azure DevOps, Git, Confluence)
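
The groundedness check above can be approximated with a simple similarity score. The sketch below uses difflib sequence matching as a stand-in for an embedding-based metric and flags generated text that drifts too far from its source; the example strings and thresholds are illustrative.

```python
# Minimal sketch: score generated output against its source content and flag
# results that fall outside the expected similarity band. difflib is a simple
# stand-in for an embedding-based similarity metric.
from difflib import SequenceMatcher

def groundedness(source: str, generated: str) -> float:
    """Return a 0-1 similarity score between source and generated text."""
    return SequenceMatcher(None, source.lower(), generated.lower()).ratio()

source_doc = "The export job writes invoices to SFTP nightly using the billing service."
generated_note = "Invoices are written to SFTP each night by the billing service export job."

score = groundedness(source_doc, generated_note)
print(f"Groundedness score: {score:.2f} (target band: 0.60-0.80)")
if score < 0.60:
    print("Generated content drifts too far from the source - flag for review.")
```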


Case study

In a recent enterprise deployment, DevelopFAST helped streamline the entire SDLC pipeline:

Impact area | Improvement
Feature breakdown | Structured user stories generated instantly
Bug resolution planning | Reduced from 5 hours to 30 minutes (~90% improvement)
Test case creation | Achieved ~50% time savings
Documentation consistency | Standardized across all tickets and tasks
Developer onboarding | Accelerated due to uniform, well-documented assets

DevelopFAST accelerates the software development lifecycle by reducing cognitive load and automating key delivery artifacts. It begins with an AI-driven Reflection phase that interprets raw requirements and generates structured user stories and test cases. The system proactively identifies architectural considerations and surfaces unresolved technical questions for early clarification.

One of the platform’s key strengths is its ability to propose three unique solution approaches based on the problem context, enabling teams to compare and select the most effective path forward. Once an approach is chosen, DevelopFAST auto-generates pseudocode, design documentation, unit tests, and production-grade code.

This results in faster planning cycles, improved code consistency, and significantly reduced time to first commit.


Using the tool


Figure 1: Reflection
Figure 2: User Story
Figure 3: Acceptance Criteria Test Cases
Figure 4: Architecture
Figure 5: Possible Approaches
Figure 6: Code Annotator (Final Code)

Looking ahead

We're continuously evolving DevelopFAST to provide more control and adaptability. Here's what's coming next:

·       Fully configurable admin panel to manage prompt logic, output sections, integration points, and user permissions from a central dashboard

·    "Help Me Write" mode to assist users in drafting user stories, test cases, or technical notes step-by-step

·    Persona-based filtering to restrict outputs to a Product Owner, Developer, or Architect view


Interested in learning more?

To learn how DevelopFAST can accelerate delivery and improve code quality, contact our team at CustomerSuccess@MAQSoftware.com today.
 

Accelerating data-driven decisions with AI-DataLens




In today’s data-driven world, organizations often struggle to make sense of business analytics reports or structured data trapped behind technical interfaces. Business users rely heavily on analysts or IT teams to extract insights, leading to delays, misinterpretations, and bottlenecks. AI-DataLens bridges this gap, transforming how users interact with data by enabling natural language queries and delivering instant, intelligent insights.

Technical barriers to data insights

Most analytics tools, while powerful, require users to have expertise in SQL, DAX, or other complex query languages. As a result:

·       Business users cannot independently extract insights.

·    Data remains underutilized.

·    Time-to-insight is slow.

AI-DataLens addresses these barriers by democratizing data access.

Figure 1: User interface

Key features


·       Chat with structured data: AI-DataLens allows users to interact with enterprise datasets using natural language. No need for SQL or DAX—just ask questions and get instant answers.

·    User guidance: Delivers intelligent question suggestions and personalized investigative recommendations to help new users effectively explore and understand their data from the very beginning.

·    Semantic relevance: Understands the true intent behind user questions by analyzing metadata and context, ensuring accurate and meaningful responses.

·       Automated visual generation: Automatically generates relevant charts and visuals from user queries. This eliminates manual effort and helps users quickly interpret the data visually.

·    Insight generation: Beyond raw data, the system generates narratives, titles, and summaries that present insights in a business-friendly format, along with recommendations to support decision-making.

·    Intelligent anomaly & trend detection: Detects deviations, emerging patterns, and trends in the dataset automatically, highlighting critical metrics and performance shifts.
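
At a high level, the "chat with structured data" flow above follows a question-to-query pattern: the user's question and table metadata go to a language model, the generated query runs against the data source, and the result is summarized. The sketch below shows that pattern against a local SQLite table; the schema, prompt, and model are illustrative stand-ins, not AI-DataLens internals.

```python
# Minimal sketch of a question-to-query flow: pass the user's question and
# table metadata to a language model, run the generated SQL, and return rows.
# Model, prompt, and schema are illustrative; requires OPENAI_API_KEY.
import sqlite3
from openai import OpenAI

client = OpenAI()
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("EMEA", "2025-05", 120.0), ("EMEA", "2025-06", 150.0), ("APAC", "2025-06", 90.0)],
)

schema = "Table sales(region TEXT, month TEXT, revenue REAL)"
question = "Which region had the highest revenue in June 2025?"

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": f"Write a single SQLite query. Schema: {schema}. Return plain SQL only, no code fences."},
        {"role": "user", "content": question},
    ],
)
sql = completion.choices[0].message.content.strip().strip("`")

# In production the generated SQL would be validated before execution.
print(sql)
print(conn.execute(sql).fetchall())
```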


Key benefits

1.       Accelerated and informed decision-making

2.       Reduced operational bottlenecks

3.       Plug-and-play extensibility


Upcoming features

1.       Automated report generation

2.       KPI-based proactive alerts

3.       Admin-level governance & cost controls

4.       Support for voice queries


Interested in learning more?

To learn how AI-DataLens can transform your interactions with data, contact our team at CustomerSuccess@MAQSoftware.com today.
 

July 10, 2025

Migrating to Microsoft Fabric to unlock one source of truth (OSOT)


Introduction

For enterprise organizations, achieving one source of truth (OSOT) is key to ensuring data consistency, accuracy, and efficiency. Our client—a multinational technology corporation—sought to establish OSOT while enabling AI-powered capabilities in their reporting and analytics workflows. Microsoft Fabric, with its unified platform and built-in AI features, was the ideal solution to help achieve their objectives.


Business challenges

The client, a global technology enterprise, had been relying on Azure Databricks for analytics and reporting for over a decade. However, data duplication and performance issues from the original lift-and-shift migration to Databricks led to increased costs and inefficiencies. Furthermore, following the launch of Microsoft Fabric, the client wanted to adopt the platform early to unlock AI capabilities. The migration presented a key opportunity for a major data clean-up and to establish OSOT while maintaining legacy functionality.


Migration stages

The migration comprised six key stages:

1.     Assess & evaluate: Define goals, assess current architecture, and determine technical fit.

2.     Plan & design: Design target architecture using the Well-Architected Framework, assess migration readiness, estimate resources, and build a migration plan.

3.     Pilot: Migrate an identified workload to Fabric, identify automation opportunities, and review final outcomes.

4.     Migrate & optimize: Migrate data products, pipelines, and semantic models using accelerators, and optimize for cost, performance, security, and scalability.

5.     Monitor & govern: Track cost, security, and performance, assess governance, and set up operational dashboards and alerts.

6.     Establish Fabric COE: Standardize practices, define best practices, and lead strategic initiatives for long-term success.


Our innovative approach

To support a smooth and efficient migration, we introduced a range of innovations that improved speed, consistency, and quality across the project. These included custom-built automation tools, the use of Microsoft Copilot for accelerated code generation, and selective integration of open-source libraries. Together, these innovations helped reduce manual effort, allowing our team to deliver faster while ensuring high data quality and preserving legacy functionality.

Additionally, given the scale and complexity of the data landscape, a key challenge was consolidating reports and models without losing critical legacy capabilities built on Azure Databricks. We ensured seamless integration with Power BI by selecting Direct Lake via the SQL endpoint. This delivered superior performance (due to caching) and introduced fewer security concerns than the alternatives.

We also implemented a data mesh architecture to promote domain-oriented ownership, improve scalability, and enhance data governance across teams.

Key automation and AI-enhanced capabilities included:

·       Code analysis: Automated end-to-end lineage and metric traceability to perform faster impact analysis.

·    Shortcut creation: Created a tool to automatically create and manage shortcuts for upstream sources.

·    Notebook and pipeline migration: Streamlined conversion of Databricks notebooks and pipelines to Fabric using a customizable, rule-based AI engine and dependency object creator.

·    Semantic model migration: Automated migration using BIM files, enabling seamless transfer of measures, relationships, and hierarchies. The process also includes intelligent correction of mismatched relationships to ensure model integrity.

·    Data quality validation: Created an event-driven framework to perform advanced validation checks, provide smart recommendations, and enforce Fabric best practices with custom AI agents.

·    Migrate and optimize: Accelerated workload migration using purpose-built tools, while optimizing for cost, performance, security, and scalability.

Lastly, ADO Copilot (MerlinBot) was used to review Pull Requests and provide recommendations based on feedback, further streamlining development and reducing manual overhead. These innovations collectively enabled an accelerated, low-risk migration that preserved legacy capabilities and delivered AI-powered analytics.
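
To make the semantic model migration step more concrete, the sketch below reads a model.bim file (the JSON/TMSL representation of a tabular model), lists its measures, and flags relationships whose endpoints no longer exist after consolidation. It is a simplified illustration of the kind of checks the migration tooling performs; the file path and model contents are hypothetical.

```python
# Minimal sketch: inspect a model.bim (TMSL JSON) export of a semantic model,
# list measures, and flag relationships that reference columns missing from
# the consolidated model. File path and names are hypothetical.
import json

with open("semantic_models/Spend/model.bim", encoding="utf-8") as f:
    model = json.load(f)["model"]

# Lookup of table -> set of column names present in the model.
columns = {t["name"]: {c["name"] for c in t.get("columns", [])} for t in model["tables"]}

# Measures that need to be carried over to the Fabric model.
for table in model["tables"]:
    for measure in table.get("measures", []):
        print(f"Measure {table['name']}[{measure['name']}]")

# Relationships whose endpoints disappeared during consolidation are flagged
# so they can be corrected before deployment.
for rel in model.get("relationships", []):
    ok = (
        rel["fromColumn"] in columns.get(rel["fromTable"], set())
        and rel["toColumn"] in columns.get(rel["toTable"], set())
    )
    status = "OK" if ok else "MISMATCH"
    print(f"{status}: {rel['fromTable']}[{rel['fromColumn']}] -> {rel['toTable']}[{rel['toColumn']}]")
```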


Figure 1: Solution architecture

Outcomes

·       OSOT: Users can now access a trusted, unified view of data directly from the SQL endpoint.

·    Report and model consolidation: Reduced the number of reports by 48% (from ~110 to ~50) and models by 50%, significantly reducing the data footprint.

·    Faster time to insight: Monthly financial data is now available 40% earlier, improving decision-making during the fiscal close period.

·    Clear ownership: The new data mesh architecture clarified domain ownership, enhancing accountability and governance.

·    Reduced costs: Projected reduction in sustained platform costs by 15–25% due to consolidation and architectural improvements.


Interested in learning more?

As a Microsoft Fabric Featured Partner, MAQ Software brings deep expertise in helping organizations unlock the full potential of Microsoft Fabric. Whether you're looking for guidance on implementing data solutions or optimizing your existing platform, we’re here to support you every step of the way.

Contact us at CustomerSuccess@MAQSoftware.com, or explore our apps and consulting services on Microsoft Azure Marketplace.