Monday, February 25, 2019

DataOps at MAQ Software

For nearly a decade, we have specialized in building, delivering, and operating business intelligence and data analytics solutions. During this time, we observed significant growth in data volume and complexity. The increased complexity has two primary causes: metric definitions rely on calculations that combine multiple data points, and the variety of data points has grown over time. We also observed an increasing number of interdependencies between projects, which arise when solutions are combined or built on top of one another.

Despite the increased volume, complexity, and number of interdependencies, modern business practices require developers to create and deploy data-based solutions faster than ever. To deliver projects on these timelines, we adopted agile methodologies, DevOps practices, and statistical process control. Together, these three disciplines form what we call DataOps.

We embraced agile methodologies at our inception, and today we continue to deliver business intelligence and data analytics projects using them. Incremental delivery, a core agile practice, enables us to deliver business value to our customers early in the development cycle. Agile methodologies are especially valuable because requirements are often difficult to ascertain up front; incremental delivery allows our customers to continuously refine them. We have found that close customer collaboration and agile practices consistently produce successful projects.

For faster delivery, we use DevOps practices: we automate and streamline the development, validation, integration, and deployment of data solutions. By introducing automation into the development life cycle, we reduced the time needed for data solutions to reach production. We can push changes to production on demand, with minimal human intervention and fewer mistakes. Automation also significantly reduced the cost of releasing incremental changes, so we can now issue several releases to production every day. In many cases, the growing web of interdependencies makes this automation a necessity.
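To make the idea concrete, here is a minimal sketch of an automated release gate in Python. The validation steps, test paths, and deploy placeholder are illustrative assumptions for this post, not our actual production tooling:

```python
import subprocess
import sys

# Illustrative validation steps; a real pipeline would invoke the
# project's own test suites and deployment tooling.
VALIDATION_STEPS = [
    ["pytest", "tests/unit"],         # unit tests for transformation logic
    ["pytest", "tests/integration"],  # integration tests against a staging copy
]


def validate() -> bool:
    """Run each validation step, failing fast on the first error."""
    for step in VALIDATION_STEPS:
        result = subprocess.run(step)
        if result.returncode != 0:
            print(f"Validation failed: {' '.join(step)}")
            return False
    return True


def deploy() -> None:
    """Placeholder for the promotion step (e.g., a release CLI call)."""
    print("All checks passed; promoting build to production.")


if __name__ == "__main__":
    if validate():
        deploy()
    else:
        sys.exit(1)  # block the release automatically; no human gatekeeping needed
```

The point is that promotion to production is blocked mechanically whenever any check fails, which is what keeps frequent, small releases cheap and low-risk.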

More recently, we improved the reliability of our live data pipelines by creating ongoing alerts. Our monitoring mechanisms are automated test cases that run at each processing stage of the data pipeline. Because data is continually processed at various stages of the pipeline, check-gates must prevent incorrect data from disrupting the system. Missing data, excessive data volumes, and wide variation in key metrics are all red flags that prompt timely alerts. When an alert fires, it triggers mechanisms that halt the flow of suspect data through the system. These monitoring and control mechanisms maintain the quality and integrity of data in the live pipelines, so our customers can manage their day-to-day business operations with confidence.
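As an illustration of such a check-gate, the Python sketch below combines the three red flags above: a row-count window catches missing or excessive data, and 3-sigma control limits, a standard statistical process control technique, catch wide variation in a key metric. The metric name, thresholds, and sample data are assumptions made for this example:

```python
import statistics


def check_gate(rows: list, history: list,
               min_rows: int = 1_000, max_rows: int = 10_000_000) -> bool:
    """Validate one batch before it flows to the next pipeline stage.

    Returns False when any red flag appears: missing data, excessive
    data volume, or wide variation in a key metric.
    """
    # Missing or excessive data volume.
    if not min_rows <= len(rows) <= max_rows:
        return False

    # Wide variation in a key metric: compare today's total against
    # 3-sigma control limits derived from recent history.
    today = sum(r["revenue"] for r in rows)  # "revenue" is an illustrative metric
    mean = statistics.mean(history)
    sigma = statistics.stdev(history)
    if abs(today - mean) > 3 * sigma:
        return False

    return True


# Example: a batch that passes the volume check but spikes far outside
# the control limits is held back and an alert is raised.
history = [100.0, 102.0, 98.0, 101.0, 99.0]
batch = [{"revenue": 0.5} for _ in range(2_000)]  # sums to 1,000, far above the mean
if not check_gate(batch, history):
    print("Alert: batch held at check-gate; upstream data needs review.")
```

In a live pipeline, a failed gate would both raise the alert and stop downstream stages from consuming the batch until the upstream issue is resolved.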

Using agile methodologies, DevOps, and statistical process control techniques has led to tremendous benefits for our customers. Agile methodologies provide the flexibility and speed required to compete effectively in dynamic business environments. DevOps practices remove the traditional bottleneck of deploying solutions to production: they shorten the time from conception to deployment, improve integration, and make it practical to move small changes to production frequently, which minimizes the risk of regression issues and the resulting downtime. Statistical process control techniques ensure live solutions continue to operate as expected.

The combination of agile methodologies, DevOps, and statistical process control techniques evolved into DataOps. DataOps combines proven methods of software development, delivery, and operations, and it is driven by businesses' need to unlock the value of their data assets in a timely, reliable, consistent, and continuous manner. With DataOps practices, development time decreases despite increases in the volume, complexity, and number of interdependencies in modern data solutions.

DataOps is the result of methodologies designed to handle complex data requirements with increasing efficiency. By adopting DataOps methodologies, we improved our processes and increased the value we deliver to our customers. As data processing demands become more complex, we will continue to pursue the most efficient means of data processing, support, delivery, and operations.