Friday, February 15, 2019

Case Study: Improve Customer Feedback Analysis with Azure Databricks



Key Challenges

   Transition feedback analysis architecture from VMs to Azure Databricks.
   Improve analytics execution speed and scalability.
   Add entity recognition and key phrase extraction services.

Fast and Accurate Feedback Analysis is Crucial

Tracking customer sentiment is an essential business activity. Customer feedback lets businesses know which efforts are working and highlights customer pain points. More significantly, understanding consumer desires enables predictive action. If every customer who enters a store asks for a certain product, the store owner knows that she should stock more of the product for the following week. But by tracking customer feedback, the store owner can dig deeper and understand why customers are demanding the product. If the store owner determines the all-important “why,” she will know whether the increased demand was due to global consumer trends, a marketing campaign, a celebrity endorsement, or any number of other reasons. In other words, customer feedback allows businesses to pursue insights they would otherwise not be aware of.

Our client, the voice of the customer team for a large software company, wanted to improve their text analytics system. The client’s system relied on VMs to compile online customer feedback and perform sentiment analysis. To improve execution speed and increase scalability, the client wanted to move the system to a serverless architecture. The client also wanted to incorporate two new features: entity recognition and key phrase extraction.

Our Process: Benefits of Azure Databricks

The client’s previous feedback architecture used Python scripts to process customer feedback. During processing, the scripts expanded contractions, removed inflectional endings (lemmatization), stripped HTML tags and punctuation, converted the text to lowercase, corrected spelling mistakes, and removed junk words from the feedback. Sentiment analysis was then performed on the cleaned data. The system was functional, but it was slow and difficult to scale compared to modern serverless solutions.
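
As an illustration only, the kind of cleaning pass described above can be sketched in a few lines of Python. The contraction map, junk-word list, and regular expressions here are simplified placeholders rather than the client’s actual rules, and the lemmatization and spell-correction steps are omitted for brevity:

```python
import re

CONTRACTIONS = {"can't": "cannot", "won't": "will not", "it's": "it is"}
JUNK_WORDS = {"asap", "fyi", "lol"}

def clean_feedback(text: str) -> str:
    """Apply the cleaning steps described above to one feedback record."""
    text = text.lower()                                   # lowercase
    for short, full in CONTRACTIONS.items():              # expand contractions
        text = text.replace(short, full)
    text = re.sub(r"<[^>]+>", " ", text)                  # strip HTML tags
    text = re.sub(r"[^\w\s]", " ", text)                  # strip punctuation
    words = [w for w in text.split() if w not in JUNK_WORDS]  # drop junk words
    return " ".join(words)

print(clean_feedback("It's <b>GREAT</b>, can't wait!! lol"))
# it is great cannot wait
```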

We knew that Azure Databricks would offer our client the exact kind of speed and flexibility they were looking for. Azure Databricks allows users to run robust analytics algorithms and drive real-time business insights. It offers one-click, autoscaling deployment that meets enterprise users’ scaling and security requirements. Azure Databricks also features optimized connectors, which we used to run Microsoft Cognitive Services APIs. This allowed our team to quickly implement entity recognition and key phrase extraction. And because the Azure Databricks solution was managed from a single notebook, our teams could collaborate more effectively across office locations. Now, when our India team begins working on contraction processing, our team in Redmond can continue with lemmatization without missing a beat.
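
As a rough sketch of how such a Cognitive Services call can be wired into a Databricks notebook: the endpoint, API version, key, and column names below are placeholders rather than the client’s actual configuration, and production code would batch many documents per request instead of calling the API row by row.

```python
import requests
from pyspark.sql.functions import udf
from pyspark.sql.types import ArrayType, StringType

# Placeholders: substitute your Text Analytics endpoint and subscription key.
ENDPOINT = "https://<region>.api.cognitive.microsoft.com/text/analytics/v2.1"
HEADERS = {"Ocp-Apim-Subscription-Key": "<your-key>"}

def key_phrases(text):
    """Return the key phrases for one piece of cleaned feedback."""
    body = {"documents": [{"id": "1", "language": "en", "text": text}]}
    resp = requests.post(ENDPOINT + "/keyPhrases", headers=HEADERS, json=body)
    resp.raise_for_status()
    return resp.json()["documents"][0]["keyPhrases"]

# In a Databricks notebook, `spark` is predefined; build a small example frame.
feedback_df = spark.createDataFrame(
    [("the checkout page keeps failing when i reload it",)], ["clean_text"]
)
key_phrases_udf = udf(key_phrases, ArrayType(StringType()))
feedback_df = feedback_df.withColumn("key_phrases", key_phrases_udf("clean_text"))
```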

Immediately Put to the Test

We completed our client’s new Azure Databricks-based feedback analysis implementation just in time for the biggest shopping test of the year: Black Friday. As it turned out, feedback analysis was crucial in ensuring a smooth purchasing process for customers.

Shortly after Black Friday sales started, the client’s online checkout tool began having technical difficulties. Overwhelmed by the number of online transactions, the checkout tool failed. No matter how many times customers reloaded the checkout page, their transactions could not be completed. Customer support lines were already inundated with calls from holiday shoppers, so the checkout complaints were slow to surface and the client was initially unaware of the problem. Fortunately, the customer feedback tool immediately compiled and analyzed the incoming feedback and surfaced the checkout issues. The voice of the customer team forwarded the feedback to the technical team, and the problem was addressed quickly.

Improved Feedback Analysis Leads to a Better Customer Experience

Our Azure Databricks feedback analysis tool improved speed and brought a new level of scalability to our client’s business operations. As evidenced by the Black Friday feedback, the tool’s speed and accuracy were a success right from the start. The technical improvements from the Azure Databricks architecture produced the ideal business outcome: faster discovery of actionable business insights. Ultimately, of course, improved insights mean a better customer experience, which is crucial to any business’s success.

Thursday, February 14, 2019

SAFe at MAQ Software




Since our inception, we’ve used agile methodologies to deliver software solutions that address our clients’ business needs. These methodologies—in their purest form—work perfectly for small projects. Agile methodologies deliver high business value by ensuring that projects respond to changing business needs. These methodologies result in functional software, delivered early and frequently, with short release cycles.

As the complexity and quantity of our projects have grown, however, we’ve recognized an increased need to accommodate interdependencies between projects. Agile methodologies work best with small teams (five to nine members). Projects with larger teams—especially interdependent projects—increase the complexity of execution. With larger teams, the benefits of agile methodologies (such as osmotic communication and uniform knowledge within teams) are difficult to achieve. Agile ceremonies (daily scrums, sprint planning, etc.) are also often difficult to manage when team sizes increase.

With the increased size and complexity of our projects, we knew we needed to avoid relying on heavy centralized planning when executing interdependent projects. Relying on centralized planning would have meant falling back to the waterfall methodology, which makes responding to change difficult. Businesses provide only a high-level roadmap for projects; with the waterfall method, project teams attempt to nail down specific business requirements from the start, an approach that is ill-suited to dynamic environments. To handle increasingly complex projects efficiently while avoiding waterfall methodologies, we needed an optimal level of central planning and coordination that retained the benefits of agile teams.

As we sought a balance between central planning and agile methodologies, we quickly realized that teams need to coordinate at various points during the sprint to eliminate redundancies and better cater to interdependencies among deliverables. With interdependent projects, it is crucial to align with the common business goals of the organization, portfolio, or program they belong to. While team-level autonomy is important, it is equally important to align teams to an organization’s objectives and to deliver projects that are part of a long-term roadmap. We determined that teams must:

   Take advantage of common infrastructure.
   Identify common requirements that can be implemented and used by multiple teams.
   Identify dependent requirements so that teams can hand off from the producer to the consumer in a timely manner.
   Synchronize deployments so that integrated solutions work seamlessly.

As part of our current project workflow, we create release roadmaps that broadly define the objectives of the organization, portfolio, and program for the next 6 to 18 months. Individual agile teams then plan sprints for the next two to four weeks, aiming to incrementally achieve the objectives of the planned roadmap. Project owners collaborate closely with the agile teams throughout the sprints to help them achieve broad goals while implementing course corrections as necessary. This close collaboration is the primary distinction between our methodologies and the waterfall model. We encourage collaboration between project owners and agile teams, whereas waterfall planning often occurs at a level of detail that minimizes collaboration with stakeholders.

These evolved methodologies closely resemble the Scaled Agile Framework (SAFe). The practices of cross-team cadence and synchronization help multiple agile teams within a portfolio or program advance together. Lean-agile principles minimize waste by avoiding replicated effort and promote the reuse of capabilities built by any team in the portfolio.

Retaining the characteristics of agile teams ensures that the teams are self-organized and self-managed. They are also able to release software in shorter cycles with continuous customer collaboration.

The SAFe framework retains many benefits of agile methodologies while scaling to accommodate complex, interdependent projects. This is accomplished through the following:

   DataOps practices help deliver valuable software to customers faster and more effectively.
   System demos ensure that the software developed by the constituent agile teams all work together as a system, thereby delivering value to the portfolio.
   Agile release trains ensure that the synchronized release of software across agile teams is smooth and efficient.
Figure 1: A representation of SAFe DataOps practices in our context.
We have found SAFe to be a viable framework for supporting the business needs of our customers in a dynamic business environment. The ability to balance the autonomy of the agile teams while ensuring the alignment of projects with the organizational objectives has been the key differentiator of SAFe. The union of systems thinking, agile principles, and lean principles results in a highly efficient and scalable framework for delivering software at scale for our large enterprise customers.

Thursday, December 13, 2018

Case Study: Sales Targeting at All Levels



Key Challenges

   Create an improved workflow for managing sales targets.
   Create a web-based data application that sales managers and executives can access and modify.
   Incorporate new functionality into the already-familiar Excel environment.

A Better Way to Track Sales Targets

Sales targets, perhaps more than any other business metric, reflect the intersection of where your business stands today with what you hope to achieve. “How much of our product can we sell this quarter?” is a question that spans the entirety of previous, current, and planned efforts. Engineers design and build products, product managers improve products’ appeal and capabilities, and marketing activities attract new clients. The sales team is directly responsible for transforming all this labor into tangible gains. At the center of all this work is the fact that running a successful business is about more than just producing a great product; it requires concerted effort and clear communication from individuals across multiple business divisions. Sales targets are a crucial final form of communication before a product is sold to a customer. They inform management and sales teams of performance expectations, allowing the management team to set clear goals and the sales team to meet those goals.

Our client needed a system that would allow their sales teams to upload sales targets for different products across varying geographies and segments. Their existing solution, a collection of Excel spreadsheets, was a major pain point. To broadcast sales targets across the company, sales managers and sellers recorded targets into spreadsheets. This process was repeated until all the sales targets were recorded; the spreadsheets were then collated and consolidated. Finally, the spreadsheets were manually dumped into SQL sources without any approval workflow. This created several challenges for our client. First and foremost, our client did not have the technical capability to push data directly to their data sources. More importantly, they could not manage their sales target workflow through those data sources. The client’s sales managers and sellers needed to be able to contribute to the workflow so that sales targets could be validated and approved directly by sales managers and upper management. As a result, our solution needed to move the data from the Excel spreadsheets into a web-based application that sales managers and executives could access and use.

Our Process

Because our primary goal was to enable sales managers and sellers to upload and view sales targets across different geographies and segments, the usability of our solution was of the utmost importance. We needed to equip sales managers and sellers to individually authenticate their sales targets on a platform accessible to all. This, in turn, meant that whatever solution we proposed needed to immediately improve the everyday work of the sales managers and sellers. Our solution needed to be intuitive, and adoption had to be instantaneous. We also knew our client’s sales managers and sellers already used Excel every day. As a result, we developed an Excel Office app built on an MVC framework with a SQL backend.

Challenges of Scale

Our initial implementation was tremendously successful. It was this success, in fact, that led to the most significant challenge of the project. We had designed our solution explicitly for subsidiary-level sales managers, but following our implementation of the app, our client asked us to expand access to include sales unit managers. When these managers attempted to examine multiple geographies and segmentations within the app, reports took far too long to load. To address this issue, we customized the amount of data that users loaded by default depending on their job role. Sales managers were now first presented with a report customized to their job role and geography, though they could still generate reports suited to broader needs if a project took them beyond their typical responsibilities.
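
A heavily simplified illustration of that role-based loading idea, with hypothetical role, table, and column names (the actual app was built on an MVC framework with a SQL backend):

```python
# Hypothetical sketch: the default query is scoped to the user's own role and
# geography, and the full data set is only fetched on explicit request.
DEFAULT_SCOPE = {
    "subsidiary_manager": "geography = :user_geography",
    "sales_unit_manager": "sales_unit = :user_sales_unit",
}

def build_targets_query(role: str, load_all: bool = False) -> str:
    base = "SELECT product, segment, geography, target_amount FROM sales_targets"
    if load_all or role not in DEFAULT_SCOPE:
        return base                      # broader report, generated on demand
    return f"{base} WHERE {DEFAULT_SCOPE[role]}"
```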

Empowered Sales Managers

Our Excel app helps 6,000 users upload their sales targets every day. The feedback we received on the project gives us a great deal of pride: “Your app empowered our sales managers and sellers to manage their targets effectively, and it immediately made their lives easier.” Because we are a consulting company, all our solutions require us to understand our clients’ problems as our own. The fact that our solution immediately made our client’s sales managers’ working lives better meant that we were well in tune with our client’s needs.

Monday, December 10, 2018

Case Study: Creating Custom Reports Has Never Been Easier



Key Challenges

   Enable users to build reports using multiple business intelligence (BI) sources.
   Create filters to set global parameters that are reflected in all reports.
   Create structured report navigation.

Dealing with the Proliferation of BI Platforms

Organizations increasingly rely on BI reporting to analyze up-to-the-minute sales and marketing data. Most organizations address their analytics needs on a piecemeal basis. It’s not uncommon for companies to develop reports on HTML5, Power BI, Tableau, and other platforms simultaneously. The variety of platforms makes creating custom reports difficult. For example, if an account executive has a critical meeting and needs visuals from multiple reports, he typically must rely on programmers to pull the information he needs and generate a new custom report.

We created a tool that allows non-programmers to create custom reports, regardless of the source of the data. With our Power BI Global Filter, users can easily navigate between reports on various platforms, create custom reports, and filter the results to display only the data they need. Our Global Filter tool screens reports based on the job role selected, making relevant data easy to find. Our tool even allows users to share default views of reports so that the receiver immediately knows what information to focus on.

Using Our Power BI Global Filter

Users begin by selecting their job role (Figure 1). Global Filter then refines the results, displaying only reports that are relevant to the selected role.

Figure 1: Role selection prompt
Users can select specific reports using the navigation menu on the left side of the app (Figure 2). This customizable menu allows users to quickly navigate report hierarchies. Once the user selects a report, it is displayed in the right-hand frame.

Figure 2: Navigation screen
Users can also create collections of reports that are tailored to specific roles or work areas (Figure 3). For example, if a user applies a filter to display only sales data for the Pacific Northwest, every report in that collection shows Pacific Northwest data only, regardless of which reporting platform the data originates from.

Figure 3: Navigation customization
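Under the hood, a collection-level filter like the Pacific Northwest example above can be expressed once and handed to every embedded report in the collection. The sketch below uses the Power BI basic-filter JSON schema with hypothetical table and column names; reports on other platforms would need their own filter mapping.

```python
# Hypothetical sketch: one global filter definition, shared by every report
# in a collection. For Power BI reports, this dictionary matches the basic
# filter schema accepted by the embedding SDK's filter APIs.
global_filter = {
    "$schema": "http://powerbi.com/product/schema#basic",
    "target": {"table": "Sales", "column": "Region"},
    "operator": "In",
    "values": ["Pacific Northwest"],
}

def filters_for_collection(report_ids, shared_filter):
    """Pair every report in a collection with the same global filter."""
    return {report_id: [shared_filter] for report_id in report_ids}
```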
Finally, users can create custom reports (Figure 4). This feature allows users to select visuals from any report, regardless of BI platform, and compile them into a single report. For example, a user could select one chart from a Power BI report and another from a Tableau report and combine them in the same custom Global Filter report. Users can also add metadata to reports detailing the data source, user concerns, or other pertinent information.

Figure 4: Custom report creation

Create Reports Without Compromises

With our Global Filter, you can reduce the time it takes to find reports, apply global filters to edit reports more easily than ever, and even create custom reports to suit your specific business needs. You’re no longer restricted by visuals that are located on disparate reporting platforms. Global Filter lets you gather the exact information you need without complex coding, saving you time and giving you complete control over your analytics visualization.

Thursday, November 29, 2018

Case Study: Connecting Databases Across Business Divisions



Key Challenges

   Merge customer identities that are siloed in various databases.
   Create “golden records” that allow division-level databases to identify specific customers.
   Allow administrator access to override data conflicts.

Abundant Data Means Complexity and Redundancy

In ever-changing business environments, systems and databases must evolve to meet increased demands. Business growth inevitably leads to increasing database complexity and often necessitates infrastructure improvements. New infrastructure requires data to be migrated and reentered, leaving room for clerical mistakes. As a result, records suffer from inconsistency and redundancy across the various departmental databases. Ultimately, this means that organizations cannot provide the highest level of service to their customers. Master data management (MDM) is a process that creates a master-level dataset, allowing interdivisional communication without the need for costly integration of division-level datasets.

Our client, a large retailer, needed a way to merge customer identities that were siloed within databases in different divisions. The company’s numerous business divisions sold a variety of products. But because they could not track customer identities across business divisions, sales and marketing teams couldn’t coordinate activities. Customers who had just purchased a product still received promotional emails for the very same product. Product sales information was not utilized for future sales and marketing efforts. In some cases, customer records were even duplicated within the same department. The systems hosting the departmental datasets were tailored to the specific needs of each business division, so unifying these systems directly would have been prohibitively expensive.


Our Process

Our client hired us to improve their database accuracy and efficiency. We started by creating reporting copies of each dataset. The next step was to create “golden records”—master customer identities that enabled division-level databases to identify unique customers, despite a multitude of duplicate stored identities.

To create the golden records from the customer identities siloed within the different divisional databases, we extracted each of the ten divisional databases onto the reporting platform, then cleaned and profiled the data. Next, we matched records using customized rules, fuzzy lookups, and machine learning procedures, deleted duplicate records, and standardized and consolidated customer identity values. Machine learning algorithms identified groups of similar records and linked them, allowing one record in each group to be designated the master record. We then mapped each master record back to the customer identities within the siloed divisional databases. Once mastered, the data was made available for use with the existing systems.
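
To make the matching step concrete, here is a heavily simplified Python sketch of the idea. The production pipeline used customized rules, fuzzy lookups, and machine learning models rather than the single string-similarity measure shown here, and the field names are illustrative:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def cluster_records(records, threshold=0.85):
    """Greedy clustering: a record joins the first cluster whose first
    member's name is similar enough; otherwise it starts a new cluster."""
    clusters = []
    for rec in records:
        for cluster in clusters:
            if similarity(rec["name"], cluster[0]["name"]) >= threshold:
                cluster.append(rec)
                break
        else:
            clusters.append([rec])
    return clusters

def choose_golden_record(cluster):
    """Designate the most complete record in a cluster as the golden record."""
    return max(cluster, key=lambda r: sum(1 for v in r.values() if v))

records = [
    {"name": "Jane Doe", "email": "jane@contoso.com", "phone": ""},
    {"name": "Jane  Doe", "email": "", "phone": "555-0100"},
]
golden_records = [choose_golden_record(c) for c in cluster_records(records)]
```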

This entire process is now automated and runs on a recurring basis. Golden records are added and refined to keep pace with changes in the transactional data. While the solution draws on customer data from various systems to improve understanding of each customer, data security and privacy requirements were observed throughout. The solution fully complies with the General Data Protection Regulation (GDPR).


Although the process is now automated, we built administrative modules that allow users to address cases where data cannot be properly processed through machine intelligence. Matching rules, thresholds, and other system parameters can also be managed within the administrative module.
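
As a sketch of how such overrides might be driven by configuration (the thresholds and action names below are illustrative assumptions, not the client’s actual parameters):

```python
# Hypothetical sketch: matching parameters live in configuration that the
# administrative module can edit, and uncertain matches are routed to a
# manual-review queue instead of being merged automatically.
MATCH_CONFIG = {
    "auto_merge_threshold": 0.92,   # merge without review above this score
    "review_threshold": 0.75,       # queue for human review above this score
}

def route_match(score, config=MATCH_CONFIG):
    """Decide what to do with a candidate match based on its score."""
    if score >= config["auto_merge_threshold"]:
        return "auto_merge"
    if score >= config["review_threshold"]:
        return "manual_review"      # surfaced in the administrative module
    return "no_match"
```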

Unified Customer Identities Across Divisional Databases

Mastering data from siloed databases allowed our client to clean the raw data, apply business rules to validate the data, and better track customer experiences across business divisions without integrating division-level datasets. The improved data management led to improved customer service. Using our master data management system, marketing and sales teams can now better address customer needs. Customers’ purchases are accurately tracked, and relevant promotions and offers are sent to them.

Monday, November 26, 2018

Case Study: Achieving Long-Term Goals Means Making Incremental Changes



Key Challenges

   Integrate two SharePoint portals that are used daily by thousands of team members.
   Incorporate existing third-party applications into the integrated portal without increasing costs or resource usage.
   Implement incremental changes to minimize downtime.

Balancing Capabilities, Costs, and Cohesion

If you are not part of an IT company, odds are your content management system (CMS) relies at least partially on third-party applications. Third-party applications are not necessarily a bad thing—they provide many beneficial CMS services. As the number of third-party applications increases, however, integration costs rise and systems become fragmented. Companies must strike a careful balance between increasing capabilities (and costs) and maintaining a well-integrated system. Development often must take place incrementally. We help our clients navigate the tradeoffs associated with incorporating new CMS functionality to increase performance one step at a time.

One of our clients, a large retailer, approached us in 2015 to make incremental changes to their SharePoint-based employee portal. Their portal allows employees to read the latest company updates and access required third-party applications. The client independently owns several thousand stores. Employees at these stores accessed the portal with no issues. But the client also needed to share company updates and application access with their licensee stores.

Maintaining consistent customer service and brand image, regardless of whether a customer visits a partner store or a licensee store, is crucial to our client's success. To ensure consistency, the client was responsible for training and educating the employees at both the partner stores and the licensee stores. But because our client did not employ the workers at the licensee stores, the client could not include them in the existing employee portal. Instead, the licensee store employees received company information through hard copies and emails. Our client needed to be able to onboard new employees at their licensee stores and provide them with company updates while maintaining a separate SharePoint portal for their full-time employees.

Our Process

In order to centralize licensee store resources, our client asked us to build a licensee employee portal. Ideally, the client wanted to push updates and information simultaneously to the main employee portal and the licensee employee portal. We began by cloning the full-time employee portal onto a separate system. This ensured that our client could house all the information needed for the employees of the licensee stores, and at the time, this was all our client wanted. But the solution introduced a logistical problem. Both portals naturally underwent incremental change as a result of their slightly different roles, yet our client still needed the portals to perform the same functions. Because the two portals were entirely separate, every update meant the same code had to be written twice. To increase the efficiency of the implementation, we generalized the logic shared by the two portals. Changes are now made once and pushed to both portals concurrently.

Challenges Along the Way

Working in a production environment that extensively relied on third-party applications presented many unique challenges. Due to the size of our client’s SharePoint farms, for example, we could not load the entire production environment in lower testing environments. Even though we used proxies to reproduce errors, we could not be certain that bug fixes would be successful until they were implemented in the production environment. This challenge was then exacerbated by the size of our client. Our client is responsible for over 10,000 stores. This means that any proposed changes to the SharePoint farms must go through a lengthy vetting process of at least three weeks.



We knew that for a project of this scale, mistakes or bugs were unacceptable. Any mistake that resulted in downtime could interrupt the workflow of thousands of company and licensee employees. Over our years-long relationship with our client, we worked hard to develop trust and demonstrate our competence through reliable results. Reflecting on the scale and duration of the project, the project lead emphasized the importance of patience. “We cannot be impatient, and we must be very thorough in our work. If we say, ‘this is the hotfix,’ we have to be absolutely certain in our work so that it doesn’t backfire.” Our diligence and attention to detail paid off. “We had to prove that we were the best,” our project lead noted, “but as long as you are technically capable and thorough in your work, I think the work speaks for itself.”

Tuesday, November 20, 2018

Case Study: Boosting Black Friday Sales



Key Challenges

   Create automated emails and a dashboard allowing leadership to analyze real-time sales and promotion data.
   Combine multiple database platforms into a single information portal.
   Implement secure access allowing only authorized personnel to view the sales data.

Black Friday Sales Created Unprecedented Data Volume

Our client is a business strategist in charge of his company’s brick-and-mortar and online stores. His company wanted to coordinate promotional campaigns with near real-time sales performance during the busy Thanksgiving shopping season. We had worked with our client for many years, and this latest project offered both of us a real chance to excel. In 2016, we had helped him automate retail reporting, which allowed reports to be delivered every fifteen minutes. Now, our client was tasked with automating the reporting of all sales data—a collection of data 100 times larger than the reports we had automated the previous year. With just two months until Black Friday, we set to work immediately.

Our Process

From the outset, we knew this project presented unique challenges. Each retail division had a different reporting system, and the brick-and-mortar stores relied on custom on-premises databases. Due to the sheer volume of data we needed to collect, we knew we could not process and report data at this scale using the on-premises solutions; we would need a suite of Azure technologies. These technologies, however, came with their own set of problems. At the time, Azure Analysis Services was not yet out of preview, and many of the features we needed lacked documentation. With these unknowns in mind, we conducted a series of proofs of concept over a period of three weeks. We then vetted our assumptions, communicated the results to our client, and determined our final architecture design. To our knowledge, no one had hosted a project of this scale within Azure services before. We were working in uncharted territory, but our client had complete confidence that we would deliver.

We began implementation immediately after conducting our proof of concepts. We initially had some challenges getting data at the rate we needed. We had access to real-time sales reports from retailers and online stores but did not have access to these reports from two specialized business divisions. “A lot of my time was spent reaching out to folks who managed that data,” our project lead observed. “Once their teams saw the initial version of the build, they were willing to give us access to the transactional data because they understood how successful they could be with our implementation.” Once again, our Agile methodology proved crucial throughout this entire process. Because we submitted deliverables immediately upon completion and finished the implementation ahead of schedule, we were able to win the trust of both specialized business divisions in the span of a few short weeks, despite having never worked with them before.


We finished development and basic testing two weeks before Thanksgiving. We continued running end-to-end tests and improving our implementation until the Wednesday before the holiday. Our teams were absolutely committed to the successful closure of the project: “The weekend before Thanksgiving, the entire team and I worked very hard. I don’t know how much sleep I got those five days, but I can tell you this: I was awake most nights,” our project lead said with a laugh. That week, our teams worked busily, simulating the data reporting load of the Thanksgiving shopping season. Finally, it was time to put our implementation to the test.

The first three days of the holiday weekend passed without a hitch. But on the fourth day, a minor setback arose. Because we were pulling the data in near real time, we were creating roughly fifty files every thirty seconds, and the data reporting job ran slower and slower. Initially, this was not a problem, but when the job started taking over twenty-five minutes, our engineering team stepped in. If the data reporting job took more than thirty minutes, it would eat into the timeframe allotted for the next data reporting cycle, which would have disrupted all downstream reports.


Our team first implemented a quick fix that compiled the files as they came in, which kept the problem at bay for a day. By the next day, the team had a permanent solution in place: incoming files were first defragmented (consolidated into larger files) and only then compiled.
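
As a rough sketch of that kind of compaction step (the paths, file format, and partition count below are illustrative assumptions, not the production job):

```python
# Illustrative PySpark compaction: read the many small near-real-time files
# and rewrite them as a handful of larger files before the reporting job runs.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("compact-sales-feed").getOrCreate()

small_files = spark.read.json("/mnt/sales/incoming/")   # hypothetical landing path
(small_files
    .coalesce(8)                                         # fewer, larger output files
    .write.mode("overwrite")
    .parquet("/mnt/sales/compacted/"))                   # hypothetical output path
```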

Improved Sales Reporting Without a Hitch

Our work resulted in a tremendous triumph for our client. Despite the technical challenges we faced over the Thanksgiving weekend, there was no impact on end users. Each of the 125 email reports was sent to promotional staff on time. Our client made five promotional decisions based on our reporting, and each decision improved revenue. Our reports were widely used; the dashboard that collated the sales reporting data ranked 100th out of the 75,000 dashboards used by the organization.