March 24, 2025

Optimizing data management by integrating Snowflake and Microsoft Fabric

Project overview

A leading office supply retailer struggled to manage its massive inventory dataset, which exceeded 10 billion records over a rolling two-year period. Reporting in Import or DirectQuery mode led to performance issues, refresh failures, and inefficiencies. Users also faced challenges with Snowflake mirroring in Microsoft Fabric, including inconsistent incremental updates and the need to restart refresh processes, causing delays and redundant reprocessing. To address these issues, we integrated Apache Iceberg tables into Fabric, providing a scalable and efficient solution for handling large volumes of data.


Solution implementation

The project used Apache Iceberg tables in Microsoft Fabric to process large-scale inventory and sales data. The technology stack included Snowflake and Azure Data Lake Storage (ADLS) as data sources, Microsoft Fabric as the analytics platform, and Power BI as the visualization tool. The data pipeline involved:

·       Fetching data from ADLS and creating Iceberg tables in Snowflake.

·    Establishing cloud storage structures and linking them to Fabric Lakehouse.

·    Connecting Iceberg tables from ADLS to Microsoft Fabric via shortcuts.

·    Replicating inventory and sales ad hoc reports in Power BI.

·    Conducting performance, functionality, and data integrity checks before deployment.
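The Snowflake side of this pipeline can be sketched as follows. All identifiers below (schema, columns, external volume, base location) are hypothetical examples, not the client's actual names.

```python
# Sketch of the Snowflake DDL used to expose ADLS data as an Iceberg table.
# All identifiers (external volume, schema, columns) are hypothetical.

def iceberg_table_ddl(table: str, columns: dict[str, str],
                      external_volume: str, base_location: str) -> str:
    """Build a CREATE ICEBERG TABLE statement for a Snowflake-managed table
    whose files live in an ADLS-backed external volume."""
    cols = ",\n    ".join(f"{name} {dtype}" for name, dtype in columns.items())
    return (
        f"CREATE OR REPLACE ICEBERG TABLE {table} (\n    {cols}\n)\n"
        f"CATALOG = 'SNOWFLAKE'\n"
        f"EXTERNAL_VOLUME = '{external_volume}'\n"
        f"BASE_LOCATION = '{base_location}';"
    )

ddl = iceberg_table_ddl(
    table="analytics.inventory_snapshots",
    columns={"sku": "STRING", "store_id": "INT",
             "snapshot_date": "DATE", "on_hand_qty": "INT"},
    external_volume="adls_iceberg_vol",
    base_location="inventory/snapshots",
)
print(ddl)
```

Once the Iceberg files exist in ADLS, a OneLake shortcut in the Fabric Lakehouse points at the same storage location, so Power BI reads the data in place without copying it.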


Figure 1: Solution architecture

Technical challenges and resolutions

We encountered authentication issues when creating shortcuts with service principal and organizational accounts. To resolve this, we tested alternative authentication mechanisms and scheduled data synchronization to ensure seamless updates.


Key benefits

·       Improved performance: Iceberg-based reports load in 10–12 seconds, compared to 14–18 seconds for standard reports, even with 20 concurrent users.

·    Efficient data processing: Resolved dataset refresh and load time issues under complex filtering and high concurrent usage.

·    Robust data integrity: Ensured strong guarantees for atomicity, consistency, isolation, and durability (ACID), optimizing large-scale data management in Power BI.


Interested in learning more?

As a Microsoft Fabric Featured Partner, MAQ Software brings deep expertise in helping organizations unlock the full potential of Microsoft Fabric. Whether you're looking for guidance on implementing data solutions or optimizing your existing platform, we’re here to support you every step of the way.

Reach out to CustomerSuccess@MAQSoftware.com to discover how Power BI and Snowflake can enhance your business operations, improve customer satisfaction, and drive cost savings.

Transforming supply chain analytics with Power BI on Snowflake for a specialty retailer

In today’s fast-paced business environment, effective supply chain analytics is crucial for success across industries. By integrating Power BI for reporting with Snowflake as a backend data platform, organizations can transform their approach to supply chain data. This integration enables real-time insights, supporting smarter decision-making and faster responses to market demands while optimizing operations and enhancing efficiency.


The issue

A leading U.S. retailer with over 300 stores faced challenges in managing its supply chain data and reporting systems. Despite a strong legacy of quality, customer satisfaction, and sustainability, their reliance on MySQL for data management and Qlik Sense for reporting was becoming increasingly inefficient. MySQL’s limited scalability led to performance bottlenecks as data volumes grew, while complex queries slowed processing times, making it difficult to manage large datasets effectively. To address these issues, the company decided to migrate to Snowflake, a scalable, cloud-based solution that supports real-time analytics and seamless integration with various data sources.

On the reporting side, the client faced difficulties with Qlik Sense, including high licensing costs, maintenance overhead, and performance problems with large datasets. Additional expenses for certain add-ons further complicated their reporting infrastructure. By transitioning to Microsoft Power BI, they gained a more cost-effective and user-friendly solution. This migration, combined with Snowflake integration, improved data connectivity and streamlined their reporting processes.


Our solution

Migrating from Qlik Sense to Microsoft Power BI provided the client with cost savings and a more user-friendly interface, simplifying report creation and sharing across the organization. Power BI’s seamless integration with Snowflake enables real-time analytics and advanced data visualizations, enhancing decision-making in supply chain operations. To further improve scalability and performance, the client transitioned from MySQL to Snowflake, addressing MySQL’s limitations with faster data processing, efficient handling of large datasets, and high concurrency support through its multi-cluster architecture and caching capabilities.

The migration followed a structured, multi-step process to ensure a seamless transition:

·       Data transformation: Transformed raw data in Snowflake using a dedicated layer to process and generate Power BI-ready views.

·    Data loading: Configured a unified semantic model, defining relationships in Power BI.

·    Report building: Developed intuitive Power BI reports covering all necessary data points.

·    Power Automate: Automated data updates by detecting changes in raw tables.

·    GitHub integration: Enabled version control and collaborative development across Snowflake and Power BI.
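The data-transformation step above can be illustrated with a sketch of the dedicated view layer; the table, schema, and column names are hypothetical placeholders.

```python
# Sketch of the Snowflake transformation layer: a view that reshapes raw
# tables into a Power BI-ready shape. All identifiers are hypothetical.

view_sql = """
CREATE OR REPLACE VIEW reporting.v_supply_chain AS
SELECT
    o.order_id,
    o.store_id,
    o.order_date,
    p.category,
    o.quantity * p.unit_price AS revenue
FROM raw.orders o
JOIN raw.products p ON p.product_id = o.product_id;
""".strip()

print(view_sql)
```

Power BI then connects to views like this one, so report-side modeling stays thin and the heavy lifting remains in Snowflake.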


Figure 1: Power BI reporting powered by Snowflake

Furthermore, we implemented best practices in Power BI with Snowflake as the data source to optimize performance.


·         Efficient reports: Limited visuals in Power BI to enhance performance with large datasets. Snowflake’s query optimization retrieves only necessary data, reducing query complexity and compute costs.

·     Optimal connection: Queries run against Snowflake through the data gateway in standard mode, keeping data at the source and minimizing duplication and processing.

·     Seamless data querying: Power BI’s DirectQuery mode enables real-time interaction with Snowflake, leveraging its capacity for large, concurrent queries.

·     Elastic scaling: Snowflake’s multi-cluster warehouses and auto-scaling compute model adjust resources based on workload demands, ensuring smooth Power BI queries even at peak times.

·     Data model design: Star schemas simplify the model and accelerate queries, while denormalizing frequently used metrics improves efficiency.

·     Row-Level Security (RLS): Implemented RLS using Snowflake’s native features to ensure users see only relevant data, with complex calculations managed at the source for greater efficiency.
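As a rough sketch of the RLS approach, Snowflake's native row access policies filter rows at the source before Power BI ever sees them. The policy, column, and mapping-table names below are hypothetical, not the client's actual objects.

```python
# Illustrative sketch of Snowflake-native row-level security: a row access
# policy checks the caller's role against a mapping table at query time.
# Policy, table, and column names are hypothetical.

def row_access_policy_ddl(policy: str, column: str, mapping_table: str) -> list[str]:
    """Return the statements that create a row access policy and attach it."""
    create = (
        f"CREATE OR REPLACE ROW ACCESS POLICY {policy}\n"
        f"AS ({column} STRING) RETURNS BOOLEAN ->\n"
        f"  EXISTS (SELECT 1 FROM {mapping_table} m\n"
        f"          WHERE m.role_name = CURRENT_ROLE()\n"
        f"            AND m.allowed_value = {column});"
    )
    attach = f"ALTER TABLE sales ADD ROW ACCESS POLICY {policy} ON ({column});"
    return [create, attach]

stmts = row_access_policy_ddl("region_rls", "region", "security.role_region_map")
print("\n\n".join(stmts))
```

Because the filter runs inside Snowflake, DirectQuery reports inherit it automatically and no per-report DAX security logic is needed.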


Business outcomes 

Within four months, we migrated eight critical reports to Power BI, delivering a near-real-time analytics solution. This transition reduced operational costs, improved ROI, and provided a unified, automated system. The client can now seamlessly access insights, enabling confident, data-driven decisions.

Key improvements include:

·         Faster report performance: Reports now load in near-real-time, eliminating data latency issues.

·     Cost savings: Optimized data retrieval in Snowflake and Power BI’s affordable licensing reduced expenses.

·     High user adoption: Over 80% of end users transitioned to Power BI, driving a data-centric culture.

·     Enhanced accuracy: The Power BI semantic model simplified maintenance and improved reporting precision.


Conclusion

By integrating Power BI with Snowflake, we transformed the client’s reporting capabilities, boosting efficiency, reducing costs, and strengthening decision-making. These advancements position them for continued success in a data-driven landscape.

Contact CustomerSuccess@MAQSoftware.com to discover how Microsoft Fabric and Snowflake can optimize your business, enhance customer satisfaction, and unlock significant cost savings.

March 19, 2025

Enhancing demand forecasting and planning with an AI foundation model on Microsoft Azure



A leading office supplies retailer sought to transform its demand forecasting and inventory management to enhance customer engagement and operational efficiency. Facing inaccurate demand predictions, complex customer behaviors, and a lack of AI-driven insights, the company needed a scalable, data-driven solution. We developed an AI-powered forecasting model using Microsoft Azure to refine demand predictions, optimize inventory, and drive strategic decision-making.


Key challenges

Inventory management in a dynamic retail environment is complex. The company struggled with:

·       Inaccurate demand forecasting: Traditional methods failed to predict spending behavior, leading to overstocking and stockouts.

·    Customer behavior complexity: Diverse purchasing patterns required a more granular understanding of demand drivers.

·    Limited AI utilization: The absence of AI-driven insights hindered the ability to forecast demand accurately across multiple categories.

·    Customer retention: Identifying high-value customers for targeted marketing was a challenge.

·    Holistic data integration: Internal data (historical transactions, website activity) needed to be combined with external market insights (economic trends, industry news) for more accurate forecasting.


The solution

To address these challenges, we developed an AI-powered demand forecasting model on Microsoft Azure. The solution featured:

·       Retail-specific AI foundation model: A machine learning-driven forecasting model, fine-tuned with Azure AI services.

·    Classification and regression models: Predictive analytics to identify high-potential customers and estimate spending trends.

·    Feature engineering: Key customer attributes such as purchase recency, firmographics, and product preferences were incorporated.

·    MLOps for model deployment: Azure MLFlow and Databricks enabled seamless monitoring and optimization.

·    External data integration: Economic indicators, industry news, and seasonal demand factors were incorporated to enhance forecasting accuracy.
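A minimal sketch of the feature-engineering step above, using plain Python and hypothetical field values: purchase recency and a trailing average of recent spend.

```python
from datetime import date

# Sketch of two of the customer features mentioned above: purchase recency
# and a rolling average of recent spend. Dates and values are illustrative.

def recency_days(purchases: list[date], as_of: date) -> int:
    """Days since the most recent purchase."""
    return (as_of - max(purchases)).days

def rolling_mean(values: list[float], window: int) -> list[float]:
    """Trailing mean over the last `window` observations."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

r = recency_days([date(2024, 11, 2), date(2025, 1, 15)], as_of=date(2025, 2, 1))
print(r)                                    # → 17
print(rolling_mean([10, 20, 30, 40], window=2))
```

In production this kind of logic runs over the full transaction history; the point here is only the shape of the derived features fed to the models.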

Figure 1: Solution architecture



Implementation process

1.       Data preparation and engineering:

·         Aggregated and cleaned historical transaction data, marketing insights, and external sources.

·         Identified key predictive features, filtering out non-relevant customer segments.

2.       Model development and training:

·         Developed and fine-tuned AI models using Azure AI services.

·         Tested multiple algorithms (LightGBM, XGBoost, Huber, KNN) for optimal performance.

3.       Optimization and deployment:

·         Hyperparameter tuning for improved accuracy.

·         Integrated with Azure MLOps for automated tracking and real-time feedback loops.

·         Enabled continuous learning to adapt to market fluctuations.
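The algorithm comparison and tuning described above can be illustrated with a deliberately tiny sketch: a one-dimensional nearest-neighbour regressor whose k is chosen by hold-out error. It stands in for the LightGBM/XGBoost tuning, which follows the same select-by-validation-error pattern; the data is invented.

```python
# Minimal illustration of hyperparameter tuning: pick k for a toy KNN
# regressor by hold-out mean absolute error. Real tuning over LightGBM /
# XGBoost follows the same pattern with richer parameter grids.

def knn_predict(train_x, train_y, x, k):
    """Average the targets of the k nearest training points (1-D)."""
    nearest = sorted(range(len(train_x)), key=lambda i: abs(train_x[i] - x))[:k]
    return sum(train_y[i] for i in nearest) / k

def tune_k(train_x, train_y, val_x, val_y, grid):
    """Return (best_k, best_mae) over the candidate grid."""
    scores = {}
    for k in grid:
        preds = [knn_predict(train_x, train_y, x, k) for x in val_x]
        scores[k] = sum(abs(p - y) for p, y in zip(preds, val_y)) / len(val_y)
    best = min(scores, key=scores.get)
    return best, scores[best]

# Toy demand curve: sales roughly double the "week index".
train_x, train_y = [1, 2, 3, 4, 5, 6], [2, 4, 6, 8, 10, 12]
val_x, val_y = [2.5, 4.5], [5, 9]
best_k, mae = tune_k(train_x, train_y, val_x, val_y, grid=[1, 2, 3])
print(best_k, round(mae, 2))                # → 2 0.0
```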


Business impact 

The AI-powered forecasting solution delivered measurable improvements:

·         Increased forecast accuracy: Achieved a 50% improvement in precision and 69% in recall for demand predictions.

·     Optimized inventory costs: Reduced overstocking and stockouts, leading to a 15% decrease in inventory costs.

·     Higher customer retention and sales: Targeted marketing campaigns resulted in increased revenue and customer engagement.

·     Enhanced operational efficiency: Streamlined inventory management, improving supply chain responsiveness by 20%.
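For readers unfamiliar with the precision and recall figures quoted above, the computation behind them can be sketched as follows, with illustrative binary labels (1 = customer buys in the period).

```python
# Sketch of the precision / recall computation behind the demand-prediction
# metrics quoted above, using illustrative binary labels.

def precision_recall(actual: list[int], predicted: list[int]) -> tuple[float, float]:
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

actual    = [1, 1, 0, 1, 0, 1]
predicted = [1, 0, 0, 1, 1, 1]
p, r = precision_recall(actual, predicted)
print(round(p, 2), round(r, 2))             # → 0.75 0.75
```

Precision measures how many flagged customers actually bought; recall measures how many actual buyers the model caught.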


Conclusion

By leveraging AI and Microsoft Azure, we enabled the retailer to revolutionize its demand forecasting and inventory planning. The solution not only optimized inventory but also enhanced customer engagement and sales growth. With integrated AI-driven insights, the company can now navigate market fluctuations with confidence, ensuring a competitive edge in the retail industry.

To learn how MAQ Software can help optimize your demand forecasting and inventory management, contact our team at CustomerSuccess@MAQSoftware.com.

March 18, 2025

Accelerate your Tableau to Power BI (Fabric) migration with MigrateFAST


Organizations today face a rapidly changing business landscape, where data is the key to staying competitive. As the demand for scalable, cost-effective solutions grows, many companies are making the shift from Tableau to Power BI. With AI-powered capabilities embedded in the Power BI workload within Microsoft Fabric, businesses can harness automation, self-service analytics, and advanced decision-making tools at a lower total cost of ownership.

The migration process from Tableau to Power BI can be overwhelming due to the complexities of planning, minimizing business disruptions, rebuilding reports and dashboards, adapting to a new interface, and managing costs—all while maintaining operational efficiency. MigrateFAST simplifies data and report migration through automation, reducing time-to-market and enabling a smooth transition with minimal disruptions.


6-step migration process with MigrateFAST

Transitioning from Tableau to Power BI can be smooth and efficient with the right approach. MigrateFAST’s highly automated process enables a seamless migration across six key stages, reducing costs and time-to-market.

1. Inventory Analysis: Laying the Foundation

Before the migration can take place, a complete inventory of Tableau workbooks, extracts, data models, and reports is essential to evaluate what needs to be migrated, what can be retired, and what may require re-engineering. Accurate and thorough inventory analysis ensures that you know exactly what you’re working with before diving into the migration process.

This step alone can take weeks without automation. MigrateFAST automatically analyzes Tableau data models and connections and accelerates the creation of metadata documents with a few clicks.
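Because Tableau workbooks (.twb files) are XML, an inventory pass of this kind can be sketched directly; the workbook below is a simplified, hypothetical example, not MigrateFAST's actual implementation.

```python
import xml.etree.ElementTree as ET

# Sketch of automated inventory analysis: a Tableau .twb file is XML, so
# datasources and worksheets can be enumerated directly. The snippet below
# is a simplified, hypothetical workbook.

TWB = """
<workbook>
  <datasources>
    <datasource caption='Sales (Teradata)' name='federated.1'/>
    <datasource caption='Targets (Excel)' name='federated.2'/>
  </datasources>
  <worksheets>
    <worksheet name='Regional Sales'/>
    <worksheet name='Inventory Aging'/>
  </worksheets>
</workbook>
"""

def inventory(twb_xml: str) -> dict[str, list[str]]:
    root = ET.fromstring(twb_xml)
    return {
        "datasources": [d.get("caption") or d.get("name")
                        for d in root.iter("datasource")],
        "worksheets": [w.get("name") for w in root.iter("worksheet")],
    }

print(inventory(TWB))
```

Run across every workbook on the Tableau server, a pass like this yields the metadata document that seeds the migration plan.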

2. Estimation & Planning: Clarity from Day One

Successful migrations require a clear estimate of costs, risks, and unknowns. A well-laid-out plan can help prevent unexpected issues during migration and ensure that the project remains within scope and budget.

MigrateFAST automates the estimation of licensing costs, infrastructure requirements, and the creation of a detailed migration timeline. Using historical data and pre-built migration pathways, MigrateFAST reduces uncertainty, providing a clear and actionable plan for each phase of the migration.

3. Semantic Model & Report Creation: Delivering Consistent Insights

Recreating semantic models and reports is the most visible part of the process for end users. It is essential that reports in Power BI look and behave the same way as in Tableau, if not better. Reports should be optimized for Power BI’s interface and features. Moreover, identifying and removing redundant reports helps improve the efficiency of the migration and performance of Power BI reports after deployment.

MigrateFAST reduces manual efforts by:

·       Migrating LOD expressions to DAX measures

·       Automatically creating Power BI semantic models

·       Recreating Tableau visualizations in Power BI
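To make the LOD-to-DAX step concrete, here is a sketch of one such rewrite rule: a Tableau `{FIXED ...}` expression mapped to a DAX measure. This is an illustrative single-pattern translator, not MigrateFAST's actual engine, and the table name `Fact` is a hypothetical placeholder.

```python
import re

# Illustrative sketch of one LOD-to-DAX rewrite rule: a Tableau
# {FIXED [Dim] : SUM([Measure])} expression becomes a DAX expression that
# fixes the dimension with ALLEXCEPT. The real toolkit handles many more
# patterns; the table name 'Fact' is a hypothetical placeholder.

LOD = re.compile(r"\{\s*FIXED\s*\[(\w+)\]\s*:\s*SUM\(\[(\w+)\]\)\s*\}")

def lod_to_dax(expr: str, table: str = "Fact") -> str:
    m = LOD.fullmatch(expr.strip())
    if not m:
        raise ValueError(f"unsupported LOD pattern: {expr}")
    dim, measure = m.groups()
    return (f"CALCULATE(SUM({table}[{measure}]), "
            f"ALLEXCEPT({table}, {table}[{dim}]))")

print(lod_to_dax("{FIXED [Region] : SUM([Sales])}"))
```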

4. Review: Ensuring Data Accuracy and Integrity

Migrating large amounts of data and reports from one platform to another can lead to performance issues, incorrect data mappings, or errors in calculations. Without thorough review, there is a risk that your reports could deliver incorrect or suboptimal results.

MigrateFAST automates the validation of the migrated reports by running performance tests, optimizing queries, and ensuring data accuracy through validation and certification.
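The kind of automated check run during this review can be sketched as follows: compare row counts and per-column totals between source and migrated extracts. The data and column names are illustrative.

```python
# Sketch of an automated migration check: compare row counts and per-column
# totals between the source and migrated extracts. Data is illustrative.

def validate(source_rows: list[dict], migrated_rows: list[dict],
             numeric_cols: list[str], tol: float = 1e-6) -> list[str]:
    """Return a list of human-readable discrepancies (empty means pass)."""
    issues = []
    if len(source_rows) != len(migrated_rows):
        issues.append(f"row count: {len(source_rows)} vs {len(migrated_rows)}")
    for col in numeric_cols:
        s = sum(r[col] for r in source_rows)
        m = sum(r[col] for r in migrated_rows)
        if abs(s - m) > tol:
            issues.append(f"sum({col}): {s} vs {m}")
    return issues

src = [{"sales": 100.0}, {"sales": 250.5}]
mig = [{"sales": 100.0}, {"sales": 250.0}]
print(validate(src, mig, numeric_cols=["sales"]))   # flags the 0.5 gap
```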

5. Governance: Maintaining Control and Integrity

Data governance is essential in migration to support data validation and certification processes while ensuring dataset integrity. A well-defined governance framework provides proactive alerts and real-time monitoring of capacity, usage, report availability, and platform issues.

Our approach automates performance optimization to ensure adherence to best practices and pre-defined SLAs.

6. Center of Excellence (CoE): Supporting Long-Term Success

Migration doesn’t end once the reports are deployed. Post-migration support is essential for monitoring system performance, optimizing costs, and ensuring that the organization can effectively adopt and use the new platform.

We provide ongoing support to optimize costs, monitor system performance, and improve change management. This approach drives higher adoption and long-term success—helping your organization move from reactive fixes to proactive innovation.


Customer success story

A leading retail client with 25,000+ users needed a scalable, secure, and resilient centralized data platform. Their existing infrastructure included 300+ workbooks, 350+ extracts, and a 200 TB Teradata database, supporting 200+ concurrent users. After migrating from Tableau to Power BI, the client achieved:

·       60% reduction in overall maintenance and new development effort

·    35% cost savings on platform and maintenance

·    NSAT score improvement from 2 to 3.5

·    300% increase in adoption within a year through CoE setup

With MigrateFAST, the client’s migration effort was automated by approximately 60%, dramatically reducing manual intervention and shortening the migration timeline.

| Migration step (automated using the toolkit) | Effort without MigrateFAST (hrs.) | Effort with MigrateFAST (hrs.) |
| --- | --- | --- |
| Analyze Tableau data model and connections | 10 | 0 |
| Create metadata document | 5 | 0 |
| Create Power BI semantic model | 16 | 0 |
| Understand calculations and recreate them in DAX | 24 | 0 |
| Validate and fix calculated columns and measure definitions | 16 | 20 |
| Understand and create field parameters | 8 | 4 |
| Create Power BI reports with base measures | 16 | 4 |
| Implement field parameters in the report | 4 | 2 |
| Identify alternate visuals for gaps | 4 | 4 |
| Optimize performance (CertyFAST) | 8 | 4 |
| Data validation | 8 | 8 |
| Total | 119 (~25 days) | 46 (~6 days) |

Figure 1: Migration effort with MigrateFAST per workbook



MigrateFAST ensures a seamless transition to Power BI, preserving Tableau’s look and feel while enhancing usability with AI-powered capabilities.

Figure 2: User interface of Tableau vs. Power BI

 

Expanding MigrateFAST Capabilities

Beyond Tableau, MigrateFAST streamlines migration from multiple platforms, including MicroStrategy, Cognos, SAP BOBJ, and more. Our customized approach aligns with your data architecture requirements, ensuring a seamless transition to help you harness Power BI’s full capabilities.

Accelerate your migration journey with our highly automated 6-step process. Contact CustomerSuccess@MAQSoftware.com to get started today.
 

March 11, 2025

Revolutionizing demand forecasting and planning with an AI foundation model on Microsoft Azure

A leading global food and beverage company faced significant challenges in accurately predicting demand and optimizing inventory across its diverse product portfolio. As a result, the organization faced higher costs, missed sales, and an inability to adapt quickly to market fluctuations. To address these challenges, we developed an AI-powered demand forecasting and planning solution leveraging hyperparameter tuning at scale and MLOps on Microsoft Azure. This improved forecasting accuracy, reduced costs, and enhanced business outcomes.

Many technologies enable advanced supply chain analytics by offering unique capabilities in data integration, processing, and real-time insights. Among these, Snowflake stands out due to its powerful data-sharing capabilities, high-performance computing, and seamless integration with BI tools—making it valuable for optimizing and scaling supply chain operations.


Business challenges

Prior to adopting the AI solution, the company faced several key challenges:

·       Inaccurate demand forecasts: Complex data, outdated forecasting methods, and the need for detailed predictions at various levels (location, segment, customer, and package) led to stock shortages and excess inventory. This resulted in lower customer satisfaction and profitability.

·    Manual and inefficient processes: Reliance on manual methods for updating and validating forecasts was time-consuming and prone to errors. This slowed decision-making.

·    Poor inventory alignment: Inventory levels often failed to match consumer demand, leading to high holding costs and lost sales opportunities.

·    Lack of planning flexibility: Rigid forecasting processes struggled to adapt to a diverse product portfolio and rapidly changing market conditions.


The ask

To solve these challenges, the company needed a solution that could integrate real-time data, use AI for better forecasting, automate processes, and optimize inventory levels. It also had to be scalable for growth and continuously improve through MLOps. To achieve this, they sought an experienced AI and MLOps partner to build a custom solution on a reliable cloud platform.


Solution overview and implementation

We developed a custom AI-powered demand forecasting and planning solution using Microsoft Azure, AI Services, hyperparameter tuning at scale, and MLOps practices. The structured implementation followed a phased approach, ensuring seamless integration, optimization, and deployment.

1.       Data integration and preprocessing:

·         Real-time data collection: Integrated sales, marketing, holiday, and promotion data using Azure Data Factory to enhance forecast accuracy.

·         Storage and preprocessing: Stored data in Azure Data Lake Storage (ADLS), handling missing values, outliers, and inconsistencies for a robust data pipeline.

2.       AI model development and optimization:

·         Model selection: Developed an industry-specific AI model using Random Forest Regressor and XGBoost, trained on historical data.

·         Feature engineering: Created key features such as datetime attributes, holiday indicators, promotion effects, and rolling averages to improve predictive accuracy.

·         Hyperparameter tuning at scale: Leveraged Pandas UDF and HyperOpt for parallel tuning across customer segments, refining the model for optimal performance.

3.       Deployment and automation with MLOps:

·         Model validation and monitoring: Implemented pre- and post-check frameworks with MLFlow experiments for continuous tracking of model health and accuracy.

·         Automated ML pipeline: Automated data preparation, model training, validation, deployment, and monitoring to achieve high MLOps maturity.

·         Azure deployment: Deployed models on Azure Machine Learning (AML) for scalability and real-time demand forecasting.

4.       Custom web application for forecasting:

·         User interface: Built an intuitive web application using Azure Web Services for visualizing forecasts and manual adjustments.

·         Data management: Integrated with Azure SQL Database for seamless storage and retrieval of forecast results.
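The "hyperparameter tuning at scale" step above can be sketched in miniature: each customer segment gets its own search over the parameter space, mirroring the HyperOpt-over-Pandas-UDF setup in which segments are tuned in parallel on the cluster. The objective function below is a hypothetical stand-in for a real model's validation error.

```python
import random

# Stdlib sketch of per-segment hyperparameter tuning. In the real solution,
# HyperOpt over Pandas UDFs runs these searches in parallel per segment;
# the objective here is a hypothetical stand-in for validation error.

def objective(params: dict, segment: str) -> float:
    """Hypothetical validation error; lower is better."""
    target = {"retail": 0.3, "wholesale": 0.7}[segment]
    return (params["learning_rate"] - target) ** 2

def tune_segment(segment: str, n_trials: int = 200, seed: int = 0) -> dict:
    """Random search: keep the parameters with the lowest objective."""
    rng = random.Random(seed)
    best, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {"learning_rate": rng.uniform(0.01, 1.0)}
        loss = objective(params, segment)
        if loss < best_loss:
            best, best_loss = params, loss
    return best

for seg in ["retail", "wholesale"]:
    print(seg, round(tune_segment(seg)["learning_rate"], 2))
```

Each segment converges toward its own optimum, which is why tuning per segment beats a single global parameter set.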

Figure 1: Architecture
Figure 2: Forecasting flow from data sources to data push



Solution highlights 

·         Interactive web application: Provided granular volume and price forecasts, enabling managers to plan strategically and respond proactively to market shifts.

·     Forecasting models: Developed over 1 million models to generate highly detailed forecasts across location, segment, customer, and package levels, fine-tuned for optimal performance.

·     Automated processes: Automated data validation, model training, hyperparameter tuning, and monitoring, significantly reducing manual effort and enhancing decision-making.


Key results and business outcomes 

Our AI-powered demand forecasting solution addressed the client’s challenges to streamline demand forecasting and inventory management. Key outcomes included:

·         Improved forecast accuracy: The solution achieved over 90% accuracy for key product categories and high forecast accuracy across various segments, with rates above 80% for the top 35 locations. This level of precision is crucial for making informed inventory and sales decisions.

·     Reduced manual effort: By automating the forecasting process, we reduced the turnaround time for updating forecasts from two weeks to just 30 minutes, freeing the team to focus on strategic initiatives rather than routine tasks.

·     Scalability: The solution handled a large volume of data and models, providing demand forecasting and planning capabilities that can grow with the business.

·     Model performance: Through hyperparameter tuning and continuous monitoring, we improved overall error rates and accuracy metrics, ensuring the forecasting models perform optimally over time.

·     Enhanced planning capabilities: The solution enabled integrated business planning with a unified view of demand and supply, and supported dynamic market planning by incorporating real-time data and external events into forecasts.

·     Optimized inventory management: AI-driven demand predictions led to a 20% reduction in stockouts and more accurate inventory levels, improving customer satisfaction through timely product availability.

·     Reduced inventory holding costs: More accurate demand predictions resulted in a 15% reduction in inventory holding costs, directly impacting the company’s bottom line.
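Forecast-accuracy percentages like those above are commonly computed as 100% minus a weighted absolute percentage error (WAPE); whether this exact metric was used here is an assumption, but the sketch shows the standard calculation on illustrative numbers.

```python
# Sketch of how forecast-accuracy percentages are commonly computed:
# accuracy = 100 * (1 - WAPE). Whether this exact metric was used in the
# engagement above is an assumption; the data is illustrative.

def forecast_accuracy(actual: list[float], forecast: list[float]) -> float:
    """Return accuracy in percent as 100 * (1 - sum|a-f| / sum|a|)."""
    abs_err = sum(abs(a - f) for a, f in zip(actual, forecast))
    return 100 * (1 - abs_err / sum(abs(a) for a in actual))

actual = [100, 120, 80, 150]
forecast = [95, 125, 85, 140]
print(round(forecast_accuracy(actual, forecast), 1))    # → 94.4
```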


Conclusion

The AI-powered demand forecasting solution on Microsoft Azure transformed the company’s inventory management, planning, and forecasting capabilities. The solution facilitated faster development and deployment of new models, ensuring that the business can adapt and remain competitive in the food and beverage industry.

To learn how MAQ Software can help optimize your demand forecasting and inventory management, contact us at CustomerSuccess@MAQSoftware.com.