September 16, 2024

Transforming data management in manufacturing with an AI-driven chatbot solution

Generative AI (Gen AI) is transforming manufacturing by enabling advanced data analysis, automation, and faster decision-making. Tools like OpenAI's GPT-3.5 can streamline processes further, improve data access, and drive innovation. Gen AI solutions such as chatbots boost efficiency by providing quick, accurate responses, reducing manual tasks, and increasing productivity. Together, these factors can give manufacturers a competitive edge.

 

About our client

Our client is a globally recognized manufacturer specializing in industrial test, measurement, and diagnostic equipment. Renowned for their innovative solutions, they serve a broad range of industries, including electronics, industrial automation, and electrical engineering. Their commitment to precision and reliability has established them as a leader in their field.


The issue at hand

The client managed their project lifecycle management data and processes through multiple documents stored on a SharePoint website, which also hosted the latest news in the form of .aspx pages. Users found it challenging to locate specific documents and extract the necessary information. This lack of efficient data retrieval hindered productivity and slowed down decision-making. To address this, the client needed a generalized chatbot that could simplify data access and improve workflow efficiency.


How we stepped in with a Generative AI solution

To address the client’s data management challenges, a chatbot powered by advanced AI technologies was developed. The aim was to create an intuitive system that allowed employees to easily query information from a centralized repository of project-related documents. This AI-driven chatbot not only improved data accessibility but also ensured accurate responses from trusted sources. By integrating Azure AI services with OpenAI’s GPT-3.5 model and Azure Cognitive Search, the solution was built to improve operational efficiency. The chatbot became a critical tool in the client’s project management, enabling faster decision-making and reducing time spent on manual data searches.


The solution was implemented through a multi-step process that ensured seamless integration of various data sources and technologies:

1.     Data ingestion: Power Automate flows automated the copying of files from SharePoint and static Excel sheets to Blob storage, triggered by file additions or modifications. Data included PDFs, Excel files, and .aspx pages to cover the organization’s document repository.

2.     Data processing: Custom code was developed to decode .aspx pages and extract relevant text, saved as .txt files for inclusion in the data workflow. Doing this ensured all formats were accessible to the chatbot.

3.     Indexing: Azure Cognitive Search—a service within Azure AI that specializes in indexing large volumes of data—indexed data from Blob storage. The service processed various formats such as PDFs, Excel, and .txt files, creating a searchable index that served as the chatbot’s backbone.

4.     AI integration: With the data indexed and ready for querying, the next step involved deploying OpenAI’s GPT-3.5 model. The model was integrated with Azure Cognitive Search to generate precise, context-based responses using the indexed data. When a user submitted a query, GPT-3.5 analyzed the ranked results returned by Azure Cognitive Search and crafted a well-informed answer (see the sketch after this list).

5.     User interaction: The chatbot interface, hosted on an Azure WebApp, allowed for real-time interactions. Cosmos DB stored historical conversations for future reference and analysis.

6.     Additional features: To further optimize the solution, backend processes were implemented for data management and cost efficiency. These included moving older files from the Hot to the Cold storage tier and transforming processed data from Cosmos DB into gold tables for reporting and analysis.
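Below is a minimal sketch of the retrieve-then-generate pattern described in steps 3 and 4, using the Azure Cognitive Search and Azure OpenAI Python SDKs. The endpoint URLs, keys, index name, field name, and deployment name are hypothetical placeholders, not the client's actual configuration.

```python
# Minimal sketch of the search-then-generate pattern described in steps 3 and 4.
# All endpoints, keys, the index name, and the deployment name below are hypothetical.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint="https://<search-service>.search.windows.net",
    index_name="project-docs-index",          # hypothetical index name
    credential=AzureKeyCredential("<search-key>"),
)
openai_client = AzureOpenAI(
    azure_endpoint="https://<openai-resource>.openai.azure.com",
    api_key="<openai-key>",
    api_version="2024-02-01",
)

def answer(question: str) -> str:
    # 1. Retrieve the top-ranked document chunks from Azure Cognitive Search.
    hits = search_client.search(search_text=question, top=5)
    context = "\n\n".join(doc["content"] for doc in hits)   # assumes a 'content' field

    # 2. Ask GPT-3.5 to answer using only the retrieved context.
    response = openai_client.chat.completions.create(
        model="gpt-35-turbo",                 # hypothetical deployment name
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context. "
                        "If the answer is not in the context, say so.\n\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("What is the current status of project Alpha?"))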


Figure 1: Components of the AI-driven chatbot solution



How the solution yielded business value

Implementation of the AI-driven chatbot solution yielded improvements across the organization’s operations. Through this solution, the organization was able to dramatically improve the accessibility and usability of their project lifecycle management data. The chatbot not only empowered employees to quickly find the information they needed but also facilitated more informed decision-making.


The post-deployment measures demonstrated the effectiveness of the solution:

·       Improved data retrieval time: The time required to locate and retrieve specific documents or information was reduced drastically, allowing employees to focus more on high-value tasks.

·       Increased employee productivity: With quick access to accurate information, overall employee productivity saw an increase. This was measured by the reduction in the time spent on non-productive activities like manual data searches.

·       Reduction in operational costs: By automating data retrieval and integrating backend processes like Hot to Cold storage conversion, the organization achieved a large reduction in operational costs related to data management.

·       Increased data accessibility: The centralization and indexing of diverse data formats made 100% of the project lifecycle management data accessible through a single interface. This meant that no critical information was overlooked.


These results underscore the success of the chatbot solution in transforming the organization’s approach to data management. By using advanced AI technologies, the client not only optimized its internal processes but also positioned itself for greater efficiency and competitiveness in the market.

For any further inquiries, contact Sales@MAQSoftware.com to see how chatbots powered by Gen AI can transform your business, improve productivity, and accelerate your delivery.

September 10, 2024

[OLD CONTENT] Everything You Need to Know About Migrating to Power BI

 


What is Power BI?

Power BI is a business intelligence (BI) tool that enables users to easily transform their data into actionable insights. Business intelligence refers to the collection and analysis of business operation data. Insights from this data enable business leaders to identify growth opportunities and close operational gaps.

Previously, BI reporting platforms required developer expertise. Today, anyone can develop intuitive, insightful dashboards and reports with the right tool. Power BI is a powerful, easy-to-use tool that offers a wide range of storytelling visuals that help you understand your business opportunities. 

Power BI Migration Strategy

Large-enterprise companies rely on insights from BI platforms. When companies use unoptimized platforms, insights are slow, inaccurate, and siloed, delaying business-critical decisions for weeks. The most common issues our clients face with other BI platforms are:
1.    Slow-loading pages and reports
2.    Difficulty managing and maintaining multiple data sources
3.    Decentralized reporting
4.    High maintenance costs

Increasingly often, large-enterprise companies are turning towards Power BI. Power BI’s centralized, dynamic reporting better addresses their real-time business needs. However, migrating large volumes of data from enterprise systems can be challenging.
Managing terabytes of data and training thousands of team members in a new system requires meticulous planning. That’s where we come in. After leading over 100 Power BI migrations for large-enterprise companies, and implementing over 8,000 Power BI solutions, we’ve developed a simple six-step migration strategy. With our strategy, you can rest assured that we’ve covered all the bases, enabling you to migrate seamlessly to Power BI.

1. Requirement Gathering and Analysis

Before we start actually moving data, it’s important to understand your current landscape. This means evaluating the existing reporting platform to understand your current needs, key functionalities, and gaps. We examine reports, dashboard usage, UI/UX, audiences, data sources, and security to create a report inventory and data estate. This information determines your migration scope, performance requirements, and complexity.

2. Planning and Design

Now that we understand your existing landscape, it’s time to move onto developing a road map. This sets the stage for the migration’s success. As Antoine de Saint-Exupéry once said, “A goal without a plan is just a wish.”

Here, we propose a solution based on all the requirements gathered in step one. To ensure everyone agrees with the plan of action, we set up a proposal meeting that involves architects, data administrators (admins), infrastructure admins, legal and security teams, and the Power BI product team (if required).

In general, we divide planning and design into five sub-steps:

1.  Perform a detailed gap analysis to identify the different features, visualization, and modeling challenges we need to address during migration
2.  Propose a Power BI architecture, including security implementation, refresh latency, and report performance
3.  Design report templates and prepare mock-ups
4.  Define the scope for automated validation
5.  Propose a deployment strategy and implementation timeframe

3. Execution

Now, it’s time to implement the approved solution architecture. Because we spend so much time on the planning stage, this step is straightforward. To optimize our workflow, we follow the agile methodology with the following steps:

1.  Sprint plan: Create a product backlog and define the scope and length of sprints
2.  Implementation: Using best practices, reusable templates, and themes, start the report migration and provide incremental report builds
3.  Performance tuning: Refine the architecture and report layout to optimize the data model for performance
4.  Testing: Use a set of in-house performance analysis tools to automate testing, which tracks query performance and suggests visual layout and data validation optimizations
5.  Deployment: Close our sprint by automating report deployment and readying the build for user acceptance testing (UAT)

4. Deployment and Post-Production

During this step, we ensure the new reports are user-friendly and high-performing. First, we conduct numerous UAT sessions. UAT ensures the reports are optimized for their target audience. Once we receive sign-off for UAT and production, it’s time for deployment. We automate deployment, giving end users immediate access to the reports. To complete the transfer of ownership, we hand off the code, report, and workspace inventory to our client.

For many companies, Power BI migration ends here. However, we believe that successful adoption is a critical part of migrating to Power BI. That’s why we dedicate the next two steps to post-migration success.

5. Center of Excellence (CoE)

According to Microsoft, “A Center of Excellence (CoE) drives innovation and improvement and brings together like-minded people with similar business goals to share knowledge and success, while at the same time providing standards, consistency, and governance to the organization.”

During our CoE trainings, we enable our clients to become self-sufficient Power BI users. We run numerous CoE sessions that train team members across your organization in Power BI capabilities, governance, and best practices. These enable technical users, business users, and operations team members to become familiar with the new data system as the old system is gradually moved offline. Our custom trainings include regular office hours with certified engineers, an advanced curriculum, and pre-built solutions and frameworks. On average, our CoEs shorten the Power BI adoption timeframe from years to months.

If you are already at this migration stage, or need some help boosting Power BI adoption, check out our virtual CoE trainings, offered to any organization year-round:
  Admin CoE 

6. Decommissioning

There’s nothing worse than a cluttered data system. To avoid redundancies, we systematically retire old reports. Here, our main goal is moving you onto the new system without impacting ongoing business operations. At MAQ Software, we believe migration to Power BI should be seamless.

Figure 1: Complete Process Overview

Benefits of Migrating to Power BI

By migrating to Power BI using our six-step approach, our clients have benefitted from:

Quicker Insights for Decisionmakers
  Reduced latency between data sources and reports  
  Increased scalability 

Self-Service BI
  Business users can create reports and customize dashboards without developer expertise  

Centralized Reporting
  Admins can easily manage and govern their organization’s reports with centralized administrative capabilities 
  Users can combine data from different sources, such as Dynamics 365, Office 365, and hundreds of other relational databases
  Increased accuracy by offering a single source of truth for all reporting layers through shared datasets 

Power BI Migration Case Studies

As the 2021 Microsoft Power BI Partner of the Year, we have experience migrating clients from a wide variety of data visualization platforms to Power BI. Our expertise enables us to easily manage large volumes of data and enable business continuity throughout the migration process. Here is a sample of how we’ve empowered our clients to migrate to Power BI.

Tableau to Power BI

Client: An international fast-moving consumer goods (FMCG) company.

Situation: Our client wanted to centralize their reporting platforms by migrating from Tableau to Power BI. As their existing Tableau reports were developed over time, it was complex to migrate them without compromising functionality.

How We Did It: We discussed each report in detail to understand its underlying business purpose. Then, we used our knowledge of Power BI to identify the best methods of achieving the same results in a new system. Spending time with the actual report users gave us insight into end user flow, enabling us to design an intuitive Power BI report.

Results: We migrated over 250 Tableau workbooks to Power BI. The new reports were better organized and decluttered. With easy navigation and optimized design, the new reports achieve the same functionalities as the old ones, with increased performance and accessibility. Our Center of Excellence trainings also helped increase post-migration adoption by 300%.

Qlik to Power BI

Client: EF Education First, a global education foundation with offices in 50 countries.

Situation: EF Education First needed a modern reporting platform with self-service analytics, easy scalability, and low operational costs.

How We Did It: We performed a gap analysis of the features and visualizations in Qlik and Power BI. Qlik supported 16 reports, with a data source consisting of 20+ SQL databases and 30 Excel sources. We ensured all required data could be transferred and visualized per the client’s needs.

Results: Power BI’s low-code architecture and cloud-based centralization gave EF Education First access to scalable, self-service analytics.

Find out more about our QlikView to Power BI migration

SAP Business Objects (SAP BO) to Power BI

Client: A multinational food, snack, and beverage corporation with 250,000+ team members.

Situation: With our client’s high volume of data, their existing SAP BO reports took over five minutes to load. Running many slow-loading reports took up the team’s valuable time, negatively impacting business operations.

How We Did It: We implemented a tabular model with Azure Analysis Services (AAS) to enable fast, efficient reporting in Power BI. Data loads from our client’s existing Teradata storage into AAS. For users with alternate view and calculation needs, reports can be exported directly from AAS to Excel. AAS is better equipped to store the large models and pre-aggregated data needed for real-time visualization, and it provides a dedicated central processing unit (CPU) and memory, independent of the load on Premium capacity.

Results: Migrating from SAP BO to Power BI reduced report run time by 90%. Previously, reports could take up to five minutes to load. With our solution’s back-end Azure Analysis Services (AAS), dense data now loads into Power BI in less than 20 seconds. Users can rapidly customize and run reports without the wait. AAS also has a built-in feature that provides time intelligence for KPIs on the fly.

MicroStrategy to Power BI

Client: A global Fortune 500 retailer.

Situation: Our client sends weekly and monthly report snapshots to subscribed internal and external users. MicroStrategy offers an easy and intuitive method to share reports like this. However, our client had recently migrated their other systems to Power BI as it offered better long-term scalability. To reduce costs, our client wanted to consolidate all functionalities to a single platform. We needed to implement a similar export/subscription functionality using Power Platform.

How We Did It: We used the existing subscription list and created a security model that works with Power Automate schedules. Then, we converted data tables in MicroStrategy to paginated reports in Power BI. Using the Export API, the data can now be exported as an attachment to share with external and internal users.
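As a rough illustration of the export step, the sketch below calls the Power BI REST API's Export To File endpoints to render a paginated report to PDF. The workspace and report IDs, the token acquisition, and the output file name are hypothetical placeholders; the production solution scheduled this through Power Automate rather than a standalone script.

```python
# Hypothetical sketch of exporting a paginated report via the Power BI "Export To File" REST API.
import time
import requests

ACCESS_TOKEN = "<azure-ad-token>"          # assumed to be acquired separately (e.g., via MSAL)
GROUP_ID = "<workspace-id>"                # hypothetical workspace ID
REPORT_ID = "<paginated-report-id>"        # hypothetical report ID
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/reports/{REPORT_ID}"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1. Start the export job (PDF in this sketch).
job = requests.post(f"{BASE}/ExportTo", headers=HEADERS, json={"format": "PDF"}).json()

# 2. Poll until the export job completes.
while True:
    status = requests.get(f"{BASE}/exports/{job['id']}", headers=HEADERS).json()
    if status["status"] in ("Succeeded", "Failed"):
        break
    time.sleep(5)

# 3. Download the file; it can then be attached to the subscription email.
if status["status"] == "Succeeded":
    pdf = requests.get(f"{BASE}/exports/{job['id']}/file", headers=HEADERS)
    with open("weekly_snapshot.pdf", "wb") as f:
        f.write(pdf.content)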

Results: We helped our client retire their outdated MicroStrategy reports without losing their easy sharing capabilities. Because Power BI is part of the Power Platform, it integrates seamlessly with other powerful tools, such as Power Automate and Power Apps. Now, our client can view dashboards, manage reports, and share insights using a platform that is both scalable and sustainable.

Looker to Power BI

Client: A leading retail firm that provides office supplies. 

Situation: Our client sought a centralized business intelligence (BI) platform that delivers low operational and maintenance costs while providing self-service analytics capabilities. They also required a seamless migration from their on-premises data source to a cloud-based one.

How We Did It: Our team established a centralized Power BI dataset by importing data from a cloud-based source. To optimize query performance and minimize costs, we implemented custom partitioning and incremental refresh policies in Power BI, reducing the overall number of queries fired to the cloud-based source. Our solution also met the client's requirements for data refresh latency, ensuring that the dataset was always up to date and readily available for analysis.
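For context on how such a policy is typically exercised, the sketch below triggers an enhanced dataset refresh through the Power BI REST API so that the incremental refresh policy is honored rather than fully reloading the dataset. The IDs, token, and request options are illustrative placeholders, not the client's actual scheduling setup.

```python
# Hypothetical sketch: trigger an enhanced refresh that applies the incremental refresh policy.
import requests

ACCESS_TOKEN = "<azure-ad-token>"     # assumed to be acquired separately
GROUP_ID = "<workspace-id>"           # hypothetical workspace ID
DATASET_ID = "<dataset-id>"           # hypothetical dataset ID

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
body = {
    "type": "full",
    "applyRefreshPolicy": True,   # honor the incremental refresh policy instead of a full reload
    "commitMode": "transactional",
}
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, json=body)
resp.raise_for_status()
print("Refresh request accepted:", resp.status_code)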

Results: We assisted the client in retiring their Looker reports and migrating to Power BI, empowering end-users with self-service reporting capabilities. With Power BI's user-friendly interface, users can easily customize their report views and gain valuable insights. Power BI's built-in export functionalities also enable users to seamlessly share their findings with others, making it a more collaborative and efficient tool for the client's reporting needs. 

Cognos to Power BI

Client: A global service provider in the Health, Tax & Accounting, Legal & Regulatory, and Finance industries. 

Situation: Due to our client's high volume of reports, their existing Cognos reporting system had a high cost per click, on top of limited UI features. These drawbacks negatively impacted business operations.

How We Did It: We implemented a tabular Microsoft SQL Server Analysis Services (SSAS) model that allowed for fast and efficient reporting in Power BI. The data from the client's existing data warehouse was loaded into SSAS, which is better equipped to store large models and pre-aggregated data needed for real-time visualization. With SSAS as the backend, reports can be generated directly from Excel for users with business priority and calculation needs. Additionally, SSAS provides a dedicated CPU and memory, which further optimizes the reporting process. Powerful features such as Export, Subscribe, and User Management (which can restrict users with lower privileges from publishing reports to the workspace) can easily be customized and managed using Power BI Report Server. 

Results: Migrating from Cognos to Power BI reduced the cost per click by ~50%, while the more appealing visuals also increased report usage. Our solution allows dense data to load into Power BI in less than 3 seconds, so users can rapidly customize reports in an improved UI and run them without delay. SSAS also has a built-in feature that provides time intelligence for KPIs on the fly, which further enhances the reporting process.

***

While our six-step migration strategy provides a general framework for success, each organization’s needs are different. Need help getting your Power BI migration on track? Partner with us by emailing Sales@MAQSoftware.com.


Up Next


To further improve your Power BI performance, check out our Power BI Best Practice Guide.

August 30, 2024

Modernizing reporting platforms through AI-driven SQL Migration

Need for transformation

A leading automotive retailer's legacy on-premises system had become a bottleneck, with an unresponsive user interface and a poor user experience. The company recognized the need to modernize its reporting infrastructure: the existing platform had limited scalability, resulting in slow response times, and it was increasingly difficult to quickly create or adjust reports. This made it hard for executives and business managers to access the timely, accurate insights required for business decisions.

To address these challenges, the company embarked on a digital transformation journey. The goals were to modernize its reporting platform and automate the migration of PostgreSQL objects to Snowflake. This would improve system performance and allow for efficient scaling to meet the growing reporting needs across the organization.

 

The ask

·        Modernize the platform: Transform the outdated on-premises SQL infrastructure to a scalable and responsive cloud-based solution.

·        Automate migration: Use OpenAI to automate the migration of on-premises PostgreSQL objects to Snowflake SQL.

·        Create a reusable framework: Develop a reusable API and UI framework that can scale with the company's growing reporting needs.


Tackling the task

Figure 1: Proof of concept diagram


To address these requirements, the project team conducted a comprehensive assessment and proposed a two-pronged approach: modernizing the platform and automating the migration process.

Migration strategy: The data warehouse was migrated from PostgreSQL stored procedures/functions to Snowflake views. OpenAI played a crucial role in automating and expediting this process. A Python tool was developed to take stored procedures/functions, prompts, and table mappings as input and generate candidate Snowflake SQL tailored to the on-premises-to-Snowflake migration. This allowed the team to efficiently manage the complex migrations; a simplified sketch of the approach follows below.
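The sketch below illustrates the kind of tool described above under simplifying assumptions: it sends one stored procedure, a conversion prompt, and a table-mapping dictionary to an Azure OpenAI deployment and returns candidate Snowflake SQL. The deployment name, file names, and prompt wording are hypothetical, not the actual tool's.

```python
# Simplified sketch of a PostgreSQL-to-Snowflake conversion helper built on Azure OpenAI.
# The deployment name, file paths, and prompt wording below are hypothetical.
import json
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<openai-resource>.openai.azure.com",
    api_key="<openai-key>",
    api_version="2024-02-01",
)

def convert_to_snowflake_view(procedure_sql: str, table_mappings: dict) -> str:
    """Ask the model to rewrite a PostgreSQL stored procedure as a Snowflake view."""
    prompt = (
        "Rewrite the following PostgreSQL stored procedure as an equivalent Snowflake view.\n"
        "Replace loops with recursive CTEs, conditional branches with WHERE conditions,\n"
        "and expose any input parameters as columns that can be filtered on.\n"
        f"Use these table name mappings: {json.dumps(table_mappings)}\n\n"
        f"{procedure_sql}"
    )
    response = client.chat.completions.create(
        model="gpt-35-turbo",          # hypothetical deployment name
        temperature=0,                 # deterministic output for code generation
        messages=[
            {"role": "system", "content": "You are an expert in PostgreSQL and Snowflake SQL."},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content

# Example usage with a hypothetical procedure file and table mapping.
with open("get_daily_sales.sql") as f:
    candidate_view = convert_to_snowflake_view(f.read(), {"public.sales": "ANALYTICS.SALES"})
print(candidate_view)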

Dynamic routing and configurable UI: A dynamic routing framework was created for the API, along with a configurable UI framework. This made the solution highly adaptable, ensuring that the platform can scale seamlessly to meet new reporting requirements.



Challenges

Several technical challenges arose during the migration:

·        Stored procedure length: Large stored procedures were difficult to migrate due to the input and output token size limitations of Azure OpenAI models. To overcome this, the team manually added delimiters to split the code into manageable segments (see the chunking sketch after this list). This ensured accurate migration without breaking logic or joins.

·        Syntax errors: The initial proof of concept revealed syntax errors due to differences in SQL dialects between the source system and Snowflake. For example, functions like ISNULL in the source code had no direct Snowflake equivalent. The team iteratively refined the OpenAI prompts to avoid such issues.

·       Control flow statements: Converting control flow statements from procedural code in stored procedures to set-based views required innovative approaches. Loops were transformed into recursive CTEs, and conditional statements were converted into WHERE conditions.

·       Handling parameters in views: Unlike stored procedures, views do not accept parameters. This challenge was addressed by incorporating parameters as columns in the views and applying conditions to filter the data accordingly.
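To illustrate the first challenge above, here is a minimal chunking sketch. It assumes the manually added delimiter is a comment marker such as -- <SPLIT> and uses a token count to flag segments that still exceed the model's input budget; the delimiter, budget, and file name are illustrative, not the team's actual values.

```python
# Minimal sketch: split a long stored procedure on manual delimiters and check token budgets.
import tiktoken

DELIMITER = "-- <SPLIT>"                 # hypothetical delimiter added by the team
MAX_TOKENS = 3000                        # illustrative per-segment budget

encoder = tiktoken.get_encoding("cl100k_base")

def split_procedure(procedure_sql: str) -> list[str]:
    """Split on the manual delimiters and flag any segment that still exceeds the budget."""
    segments = [s.strip() for s in procedure_sql.split(DELIMITER) if s.strip()]
    for i, segment in enumerate(segments, start=1):
        token_count = len(encoder.encode(segment))
        if token_count > MAX_TOKENS:
            print(f"Segment {i} has {token_count} tokens; add another delimiter.")
    return segments

with open("load_inventory.sql") as f:    # hypothetical file name
    for segment in split_procedure(f.read()):
        pass  # each segment would then be passed to the conversion step sketched earlier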

 

The solution and outcome

Figure 2: Results summary


The final solution led to significant improvements in the reporting platform:

·        Efficiency gains: The Python tool developed for the migration process enabled faster, error-free, and optimized code migration. Simple stored procedures that previously took 4 hours to migrate manually could now be migrated in just 1 hour using the tool. For complex procedures, the tool reduced migration time from 13 hours to 5 hours.

·        Improved user experience: The web application was modernized and became noticeably more responsive, with faster overall response times and a more efficient, user-friendly experience.

·       Scalable framework: The reusable frameworks developed for the API and UI ensured that the solution could easily scale to meet future reporting needs. This reduced the overall effort required for subsequent projects.



Future outlook

The configurable and reusable nature of the developed tool and frameworks positions the client for future success. The tool can be adapted for other engagements, offering scalability and integration potential with existing tools. This ensures the client is well-prepared to meet the evolving demands of the automotive industry with a robust and flexible reporting platform.


August 29, 2024

Unifying data management with a centralized framework

 


Need for transformation

A global corporation in the food and beverage industry recognized the critical need for a centralized data framework to unify and streamline data ingestion and modeling processes across its operations. With extensive data coming from various sectors, the company needed a robust, enterprise-grade solution that would serve as the single source of truth for all organizational data, ensuring consistency, accuracy, and security in data handling.

 

The challenge

Establishing a central framework for data management posed significant challenges. Ensuring data security both at rest and in transit was a top priority. The organization needed to set up strict access controls, consumption patterns, and security standards to protect sensitive information. Managing data from isolated sources, each with different security requirements and access protocols, added complexity. Without a unified solution, the organization faced inconsistencies, potential security risks, and inefficiencies in data management.

 

Personas impacted

Several key roles within the organization were impacted, including data modelers, data engineers, data scientists, data stewards, and product owners. Each role required secure, efficient access to data tailored to their specific needs and security clearance levels. Without a centralized framework, these teams had to work with isolated data sources and disparate solutions, causing delays and potential errors in their work.

 

The ask

The organization required a scalable and secure solution for data ingestion and consumption within their enterprise data foundation framework. The solution needed to integrate with their identity management system, enabling secure access to data based on users' security classification levels. Supporting multiple platforms, including Synapse, Databricks, and Data Lake, while maintaining stringent security controls, was essential.


Tackling the task

To address the requirements, an in-depth analysis of the existing data infrastructure was conducted. The approach involved the following aspects:

·        Exploration: Potential solutions for securing data at rest in storage accounts and data in transit through platforms like Databricks and Synapse were explored. Collaboration with Microsoft’s product team helped identify security design limitations and customization opportunities within the platform’s role-based access control (RBAC).

·        Design: A solution was designed to integrate the organization’s identity management system with Azure AD groups, allowing for seamless and secure access to data. This included implementing nested groups to enable access inheritance based on business requirements.

·        Implementation: Custom roles in Azure were created to limit the actions an identity could perform, and these roles were assigned to AD and Azure Security groups. A conditional access policy was also developed to allow data access based on specific metadata and organizational structure. To further improve security, tables in Synapse were organized under schemas based on security classification, with read access granted only to the appropriate groups. A minimal sketch of the custom-role step follows below.
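As a simplified illustration of the custom-role step, the sketch below defines a narrow read-only role with the Azure SDK for Python. The subscription, scope, role name, and permitted actions are hypothetical and do not reproduce the client's actual role definitions or conditional access policies.

```python
# Minimal sketch: define a restricted read-only custom role with the Azure SDK for Python.
# Subscription, scope, role name, and actions are hypothetical placeholders.
import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleDefinition, Permission

SUBSCRIPTION_ID = "<subscription-id>"                      # hypothetical
SCOPE = f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/<data-platform-rg>"

client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

role = RoleDefinition(
    role_name="Data Lake Restricted Reader",               # hypothetical role name
    description="Read-only access to classified data containers.",
    role_type="CustomRole",
    permissions=[
        Permission(
            actions=["Microsoft.Storage/storageAccounts/blobServices/containers/read"],
            data_actions=["Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read"],
            not_actions=[],
        )
    ],
    assignable_scopes=[SCOPE],
)

# The custom role is then assigned to the relevant AD / Azure Security groups.
client.role_definitions.create_or_update(SCOPE, str(uuid.uuid4()), role)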


Challenges and solutions

Challenges arose during the implementation. Customizing the cloud platform’s role-based access control and integrating it with the organization's complex identity management system proved difficult. These challenges were overcome by developing custom solutions. Close collaboration with Microsoft’s product team ensured secure and scalable access to data.

 

The solution and outcome

The final solution provided the organization with a centralized, secure, and scalable data framework. Key components of the solution included:

·        Integration of identity management with Azure AD groups and Azure Security groups.

·        Implementation of custom roles and permissions to ensure data security and compliance.

·       Streamlined access setup with minimal manual intervention, reducing errors and improving efficiency.

·       Conditional access policies based on security classification to prevent oversharing of data.

 

Results

The implementation of this solution brought impactful benefits:

·        Consistency: The centralized framework ensured consistency across various applications, reducing discrepancies and improving data accuracy.

·        Efficiency: Automated access provisioning reduced time and manual effort, allowing teams to focus on more strategic tasks.

·        Security: Improved data security through role-based access control and security classification-based access, minimizing the risk of data breaches.

 

Future outlook

With this robust and secure framework in place, the organization is well-positioned to onboard more sectors into their central enterprise framework. This will enable more comprehensive data cataloging and further improve the security and efficiency of their data management processes. The solution has not only addressed the initial challenges but has also set the stage for future growth and innovation in the organization’s data strategy.