Monday, July 25, 2022

Real-time reporting for an organic supermarket chain

Business Case

Our client, an organic supermarket chain, sells affordable natural foods to thousands of shoppers every day. To better understand customer demand, teams relied on multiple daily sales reports. However, their existing analytics platform offered limited visualization and self-service capabilities. To build reports, teams exported data from the platform to Excel, where they manually organized and analyzed data – taking as long as 15 minutes to create a single report. The process was time-consuming, error-prone, and offered little return on investment (ROI). Knowing they needed a change, the supermarket giant began working with another supplier to migrate to Microsoft Power BI, a leading reporting and visualization platform. But when the supplier failed to deliver a solution on time, Microsoft introduced us to accelerate their migration. Over the next few months, we worked closely with our client’s teams to understand their needs and deliver an effective solution.

Key Challenges

  Enable self-service reporting 
  Empower real-time insights 
  Enable holistic view across teams 
  Reduce operational costs 

Our Solution

Our client knew they wanted Power BI, but supporting the speed and flexibility they needed also required a powerful analytics platform on the back end. We performed an in-depth analysis and determined the best combination for our client’s needs was Power BI and Azure Synapse. With this combination, front-end Power BI reports can query back-end data directly, enabling teams to access insights in real time. In addition, Azure Synapse provided stronger data security than their existing platform. With Azure Synapse’s full data protection, access control, authentication, network security, and threat protection, teams can work with more confidence.

After migrating all data to Synapse, we built a robust data model that provides a holistic view across teams and sales channels. Now, teams can easily compare in-store purchases and ecommerce orders. Using our technical expertise and business knowledge, we helped teams create reports that maximize their insights. With Power BI’s slice and dice capabilities and drag and drop visualizations, teams can now edit and build reports with zero code. To ensure teams were confident using their new platform, we offered Center of Excellence (CoE) trainings and dedicated support hours.

Business Outcomes

Within three months, we migrated our client’s reports to Power BI and built a robust data model on Azure Synapse that provides a holistic view across teams. Now, report generation is automatic, collectively saving teams up to an hour a day. With a unified system and no manual effort, our client reduced their operational costs and increased their ROI. Teams can now easily access insights and be confident in their business decisions. 

Thursday, July 7, 2022

Improving insights and reducing costs through unified reporting


Business Case

Our client, a leading office retailer, enables millions of customers to work more productively. To track sales, marketing, finance, and operations, teams relied on over 100 reports. The challenge? Reports were siloed across three legacy platforms that were either no longer supported by their suppliers or too expensive to maintain. In addition to high licensing costs, teams required engineering support to edit or create reports – tying up senior engineering resources and increasing operational costs. On the back end, reports pulled from six separate data sources, leaving teams with inconsistent insights and stalled decision-making. To improve reporting and business impact, they needed a unified reporting platform and data model.

Key Challenges

  Migrate and consolidate 100 reports to a single platform 
  Build unified data model 
  Improve data reliability and availability 
  Enable self-serve reporting 
  Reduce operational costs 

Our Solution

We migrated all reports to Power BI and built a data model that pulled from Snowflake. Using our proven migration strategy, we worked with the sales, marketing, finance, and operations teams to understand the metrics they needed and the gaps in their current reports. Defining key metrics and consolidating shared ones enabled us to create a single data model that supported needs across teams. In two months, we migrated all reports from the teams’ various analytics platforms to Power BI. To optimize our workflow, we followed an Agile methodology comprising the following steps:

1. Sprint Plan: Create a product backlog and define the scope and length of sprints.

2. Implementation: Using best practices, reusable templates, and themes, migrate reports and provide incremental report builds.

3. Performance Tuning: Refine the architecture and report layout to optimize the data model for performance.

4. Testing: Use a set of in-house performance analysis tools to automate testing, which tracks query performance and suggests visual layout and data validation optimizations. In addition, conduct UAT sessions to ensure the reports are user-friendly, high-performing, and optimized for their target audience.

5. Deployment: Automate deployment, enabling users to immediately access reports. Complete the transfer of ownership – hand off the code, reports, and workspace inventory to our client.

6. Decommissioning: Avoid redundancies by systematically retiring old reports without impacting ongoing business operations.

Once we completed migration, we trained teams in Power BI through our Center of Excellence (CoE) programs. Through these trainings, teams learned the best practices they needed to confidently build and edit their own reports.

Business Outcomes

In less than two months, we migrated 100 reports to Power BI and built a unified data model connected to Snowflake. With our solution, our client has reduced their operational costs, unified reporting across their organization, and enabled self-serve reporting. Now, teams can act on insights 80% faster and confidently build the reports they need. 

Tuesday, July 5, 2022

Microsoft Azure Advisor: Everything You Need to Know


What is Azure Advisor?

Advisor is an optimization tool that analyzes your Azure environment and recommends improvements for performance, reliability, security, cost, and operational excellence. Advisor offers user-friendly dashboards and tools that enable you to maximize your insight and take action. Advisor is free with your organization’s Azure subscription and is available through your Azure portal.

Business Benefits

For many large enterprises, manually analyzing Azure resources to ensure optimal performance is time-consuming and error-prone – especially if you don’t know what to look for or how to fix it. Advisor automatically identifies problem areas in your Azure environment and offers personalized optimization recommendations, so you can follow best practices. With Advisor, you can:

  Save time by automatically identifying gaps and improving resource performance 
  Reduce costs by identifying idle and underutilized resources so you can scale appropriately 
  Strengthen security by identifying gaps 

Features

Overview Dashboard

The overview dashboard provides an at-a-glance view of your active recommendations and their impact (high, medium, low), grouped by category: reliability, security, performance, cost, and operational excellence.

Advisor Score

Your Advisor score (displayed as a percentage) represents your environment’s overall performance. For example, a score of 100% indicates you have implemented all best practices. Alternatively, a score of 50% indicates you can improve your Azure performance by implementing more best practices. Your Advisor score refreshes every 24 hours and is calculated as the sum of your category scores divided by the sum of the highest potential score in each category. In total, there are five categories: reliability, security, performance, cost, and operational excellence. Your applicable categories vary based on your active subscriptions.
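As a simplified illustration of that calculation, here is a minimal Python sketch. The category scores below are made up for the example; they are not taken from a real subscription.

    # Hypothetical category scores on a 0-100 scale (made up for illustration).
    category_scores = {
        "reliability": 80,
        "security": 65,
        "performance": 90,
        "cost": 70,
        "operational excellence": 55,
    }

    # Each applicable category can contribute at most 100 points.
    highest_potential = 100

    advisor_score = sum(category_scores.values()) / (highest_potential * len(category_scores))
    print(f"Advisor score: {advisor_score:.0%}")  # Advisor score: 72%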

Recommendations

Advisor provides personalized best practice recommendations based on the five pillars of the Microsoft Azure Well-Architected Framework:

  Reliability 
  Security 
  Performance 
  Cost 
  Operational excellence 

Each recommendation is paired with potential benefits, potential score increase, and impacted resources.
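If you prefer to pull your recommendations programmatically rather than browse them in the portal, the Azure SDK for Python exposes them as well. A minimal sketch, assuming the azure-identity and azure-mgmt-advisor packages are installed and that you substitute your own subscription ID:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.advisor import AdvisorManagementClient

    subscription_id = "<your-subscription-id>"  # placeholder
    client = AdvisorManagementClient(DefaultAzureCredential(), subscription_id)

    # List active recommendations with their category and impact level.
    for rec in client.recommendations.list():
        print(rec.category, rec.impact, rec.short_description.problem)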

Filters and Grouping

For added control, you can filter dashboards by subscription, recommendation status, and recommendation type. If you have multiple subscriptions, you can group dashboard insights by subscription.

Monitoring

To stay proactive, you can set up alerts to be notified when new recommendations are detected for your resources. Using a variety of configurations, you can prioritize subscriptions, resource groups, categories, and level of impact. You can choose to receive alerts via email and text message, or automate actions using webhooks, runbooks, functions, logic apps, or integrations with external ITSM solutions. In addition, you can set up recommendation digests, which provide periodic reports of your active recommendations. For more about how to set up alerts, review this Microsoft guide.

Downloading

To share insights, you can download score/recommendation reports as a CSV or PDF file.

Getting Started

1.    Log in to your Azure portal.
2.    On the Azure homepage under Azure services, select the Advisor icon.
Note: If you do not see the Advisor icon, type “Advisor” into the search bar and select Advisor from the search results.

Thursday, June 16, 2022

Enhancing fintech analytics to provide millions of borrowers with better loan options


Business Case

Our client, a leading fintech company, enables thousands of financial institutions to engage millions of borrowers with better loan options. Our client was on a mission to expand their analytics platform when they faced a critical block: Their existing platform architecture was at maximum data capacity. To onboard new customers, our client needed a more scalable analytics solution. In addition, our client wanted to enhance their platform’s reporting experience. Existing reporting was limited and required users to export data to Excel for manual analysis, delaying insights. To increase their product value and onboard more customers, our client needed a scalable architecture with embedded reporting.

Key Challenges

  Enable analytics platform to scale to 1000+ customers 
  Enable self-serve, near real-time analytics 
  Enable AI/ML capabilities for future innovation 
  Improve security of financial data 

Our Solution

We rebuilt our client’s analytics platform using Azure Synapse, Azure Data Lake Storage, Azure Data Factory, Azure Databricks, and Power BI. To ensure operational and technical excellence throughout the build, we followed the five pillars of the Azure Well-Architected Framework and leveraged migration strategies from Microsoft’s Cloud Adoption Framework.

Reliability: Implemented query replicas within Azure Analysis Services (AAS) to ensure resource-intensive queries do not impact ETL processing. Configured secondary and backup resources to ensure 100% resource availability.

Security: Enabled role-based access and disabled public access to storage accounts containing PII, ensuring partner data is isolated within the ecosystem. In doing this, we greatly reduced the risk of security threats.

Cost Optimization: Implemented auto-scaling in lower environments, enabled Databricks to scale down when inactive, and deployed a Power BI cost-monitoring report so services can be scaled as needed.

Operational Excellence: Created automated Terraform scripts for Azure resource deployment. Implemented proactive monitoring for pipeline bottlenecks, ETL execution, and failures.

Performance Efficiency: Implemented parallel processing and concurrent querying of underlying data model for 1000+ customers using Azure Databricks.

In addition, our automated deployment framework uses continuous integration/continuous delivery (CI/CD) pipelines to create Azure landing zones by focusing on identity, network, and resource management. To deploy Azure landing zones, we used a proprietary approach that combines the benefits of both the “start small and expand” and the “enterprise-scale” approaches. Using industry-standard best practices and our center of excellence for Azure infrastructure setup, we ensured the right configuration to build a strong foundation and avoid rework in the future. This approach reassures our customers about our capabilities while creating a secure and reliable environment that is built to last.

Business Outcomes

With our Azure Synapse-based solution, our client’s platform now offers powerful self-service, near real-time analytics, enabling their customers to reach millions of borrowers faster. The platform now has the capacity to scale and support over 1000 customers. With Azure Synapse, our client can easily integrate machine learning models like fraud detection and recommendation engines without major architecture changes. To accelerate onboarding, we developed an automated deployment framework that onboards new customers in a single click, reducing setup time from days to hours.  

Friday, December 31, 2021

Accurately Forecast Customer Sales with Machine Learning (ML)



Business Case:

Our client, a multinational food and beverage chain, operates thousands of retail stores and generates billions of dollars of annual revenue. Our client needed to understand the impact of weather, promotions, discounts, product launches, holidays, and other events on sales. The client’s existing predictive sales model routinely underestimated sales volume at both the aggregated and daily level. Our client also needed to better understand the causes of seasonal and daily spikes in sales.

Key Challenges:

  Improve the accuracy of future sales predictions. 
  Identify and analyze patterns in data for nonlinear fitting and predict future sales using historical data. 
  Examine the correlation between weather data (precipitation, temperature, pressure, wind speed, cloudiness, and so on) and sales at a specific longitude and latitude. 
  Analyze the impact of factors such as product launches, promotions, discounts, and holidays on predicted sales. 
  Include seasonality variables to explain seasonal fluctuations in the sales time series. 

Our Solution:

We built a Sales Forecasting Engine on Microsoft Azure Databricks that allowed our client to quickly and accurately predict sales.

Solution Design:

We worked with the client’s marketing operations and finance teams to collect and analyze their sales data, promotion and discount data, and store events data. We also used National Oceanic and Atmospheric Administration (NOAA) historical weather data from the US government to develop the weather model. We extrapolated the historical data and used application programming interfaces (APIs) to connect the data to our machine learning (ML) model to predict weather.

Highlights:

  Used R libraries and custom functions to cleanse and preprocess the data. 
  Used descriptive statistical analysis to tackle skewness and kurtosis for the features. 
  Performed Fourier transforms to decompose sales, analyze trends, and remove noise from the sales time series. 
  Applied logarithmic, exponential, and S-curve transformations to features to introduce nonlinearity as per real scenarios. 
  Developed hybrid regression models to predict future sales using nonlinear, multiplicative, probabilistic, regularized, and deep learning approaches. 
Figure 1: Architecture of Forecasting Engine
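The client’s production engine was built with R libraries on Azure Databricks; purely for illustration, the Python sketch below shows the general shape of such a pipeline: synthetic stand-in data, a logarithmic transform, Fourier terms for seasonality, and a regression fit on weather and event features. All column names, coefficients, and data are made up.

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    # Synthetic stand-in data so the sketch runs end to end (not client data).
    rng = np.random.default_rng(0)
    dates = pd.date_range("2019-01-01", periods=3 * 365, freq="D")
    doy = dates.dayofyear.to_numpy()
    temperature = 15 + 10 * np.sin(2 * np.pi * doy / 365.25) + rng.normal(0, 2, len(dates))
    promo_flag = rng.integers(0, 2, len(dates))
    holiday_flag = (rng.random(len(dates)) < 0.03).astype(int)
    daily_sales = np.exp(
        8 + 0.02 * temperature + 0.15 * promo_flag + 0.25 * holiday_flag
        + 0.3 * np.sin(2 * np.pi * doy / 365.25)
    ) * rng.lognormal(0, 0.05, len(dates))
    df = pd.DataFrame({"temperature": temperature, "promo_flag": promo_flag,
                       "holiday_flag": holiday_flag, "daily_sales": daily_sales})

    # Logarithmic transform of the target dampens large spikes (nonlinearity).
    df["log_sales"] = np.log1p(df["daily_sales"])

    # Fourier terms capture yearly seasonality in the sales time series.
    for k in (1, 2):
        df[f"sin_{k}"] = np.sin(2 * np.pi * k * doy / 365.25)
        df[f"cos_{k}"] = np.cos(2 * np.pi * k * doy / 365.25)

    features = ["temperature", "promo_flag", "holiday_flag",
                "sin_1", "cos_1", "sin_2", "cos_2"]

    # Chronological split: train on history, predict the most recent 90 days.
    train, test = df.iloc[:-90], df.iloc[-90:]
    model = GradientBoostingRegressor(random_state=0).fit(train[features], train["log_sales"])
    predicted = np.expm1(model.predict(test[features]))
    actual = test["daily_sales"].to_numpy()

A production engine would layer several such models (multiplicative, probabilistic, regularized, and deep learning variants) and retrain as actual sales figures arrive, as described above.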

Business Outcomes:

Our supervised ML predictive model empowered our client to analyze the impact of weather, promotions, discounts, product launches, holidays, and daily events on sales and execute business decisions accordingly. The model also identified the delay between an event and the seasonal spike, which enabled our client to maximize sales following an event. 

Our hybrid ML model is far more accurate than the previous ML model. The prediction runs on an aggregated and daily basis, and the model retrains itself once actual sales figures are injected into the model.

Our model’s Mean Absolute Percentage Error (MAPE) was 0.09, compared to the previous model’s 0.13 (a lower value indicates greater accuracy). 
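For reference, MAPE is simply the average absolute error expressed as a fraction of actual sales. A minimal sketch, with made-up numbers:

    import numpy as np

    def mape(actual, predicted):
        """Mean Absolute Percentage Error: lower means more accurate."""
        actual = np.asarray(actual, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        return np.mean(np.abs(actual - predicted) / np.abs(actual))

    # Made-up daily sales figures, just to show the calculation.
    print(round(mape([1200, 950, 1100], [1150, 1000, 1020]), 3))  # 0.056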

Highlights:

    Forecasted sales depending on weather variations for the client’s store at a specific longitude and latitude.
    Analyzed the positive and negative impacts of daily events such as discounts, promotions, launch events, and holidays on predicted and actual sales.
    Statistically identified and explained seasonal spikes in sales time series.
    Identified the lag period for daily events to explain the behavior in time series.

Wednesday, November 10, 2021

Power BI performance factors: what impacts report performance?



There is little more frustrating than a slow-loading Power BI report. When you’re working with billions of rows and columns, it can feel like improving performance is impossible. At MAQ Software, we’ve worked with over 8,000 Power BI reports across a number of industries. In our experience, it is never impossible to improve your report performance. In fact, our goal for all report pages is to load within 8 seconds (at most).

So, how do we do it? By taking a structured approach. The first step is identifying the main areas that impact report performance. After all, to diagnose the problem, you need to know where to look. Typically, there are four major factors that affect Power BI performance: the data set, data sources, report design, and network issues.

The Data Set

Performance win: Reduce your data set size

The size and characteristics of your data set can drastically impact your final dashboard. Ask yourself: What can you consolidate or eliminate? If you’re working with a lot of rows, do you need them all? What information is your business audience actually using?

Taking a good hard look at your data set size doesn’t mean you can’t work with big data sets – Power BI is absolutely designed to handle large volumes of real-time data. It’s about carefully identifying what you can keep and what needs to go. A few of the most common performance detractors within the data set include the following (see the sketch after this list for a quick way to spot them):

  • Whitespace
  • Null values
  • High column cardinality (i.e., columns with many distinct or unique values, such as user names or user IDs)
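To make this concrete, here is a minimal pandas sketch for profiling a source table before it reaches Power BI. The tiny inline table is made up; in practice you would load your own extract.

    import pandas as pd

    # Tiny made-up extract standing in for a real source table.
    df = pd.DataFrame({
        "order_id": [101, 102, 103, 104],          # unique per row: high cardinality
        "user_id": ["u1", "u2", "u3", "u4"],       # unique per row: high cardinality
        "region": ["West", "West", "East", "East"],
        "notes": [None, None, "rush", None],       # mostly null
    })

    # Share of distinct values per column: values near 1.0 indicate high
    # cardinality, which compresses poorly in the Power BI data model.
    print((df.nunique() / len(df)).sort_values(ascending=False))

    # Null-heavy columns are candidates for cleanup or removal.
    print(df.isna().mean().sort_values(ascending=False))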

Performance win: Optimize your data set model

You’re also going to want to look at your data model. When it comes to optimizing report performance, reducing the size of your model offers the best possible return on investment. The smaller your model, the faster it will run in the report. While different data sets require different models, there are a few best practices you should follow that provide quick wins for performance:

  1. Use star schema
  Star schema is, by far, the best model to use in Power BI. In a star schema, dimension tables connect directly to a central fact table (giving the model its eponymous star shape). Its alternative, the snowflake schema, normalizes dimensions into subdivisions, and each subdivision represents an additional join in your queries. In Power BI, joins translate to slow loading. The fewer you have, the better.

  2. Turn off Auto Date/Time
  Auto Date/Time enables users to easily drill down into calendar time periods without having to implement a date dimension. However, it means that for every date column in your data, there is a hidden date/time table in your model. In large data sets, this adds up; your data model can end up massive and sluggish.

  3. Summarize metrics (where possible)
  Your raw data may capture daily, or even hourly, sales, but do your end users actually need reporting at that level? If they only need to know overall monthly sales, you can significantly reduce your model size just by summarizing data by month rather than by hour.

  4. Select the right dataset model for your data
  In general, the Import model offers the fastest performance thanks to in-memory querying: data is imported into the model and cached in memory, so query results are extremely fast (as long as the data fits fully into Power BI memory). However, data models are not one-size-fits-all. If your data volumes are too large to load completely into the model, or you need to deliver real-time data, consider using DirectQuery.

Performance win: Optimize your measures

The efficiency of your DAX directly impacts the amount of time it takes a query to render data in a chart. Your best bet? Follow DAX best practices. Some quick wins you can implement today include:

  • Reducing the number of operations within your DAX
    • Before: Max Value:=IF(A>B, 1.1 * A, 1.1 * B)
    • After: Max Value:=1.1 * MAX(A,B)
  • Avoiding both-directional relationships in the data model (where both tables in a relationship cross-filter each other)
  • Moving row-level logic to Power Query (using M to calculate instead of DAX)
  • Avoiding floating point data types
  • Using the DIVIDE function instead of the division operator (/)
  • Creating a flag in the table instead of having multiple values in a single IN clause (we improved performance by ~2 seconds with this change)
    • Before: Measure 1:= IF (Status IN { "Open", "Closed", "In-Progress"},[Actual], [Target])
    • After: Measure 1:=IF (Status = 1,[Actual], [Target])

Data Sources

Performance win: Consider the cloud

The type of data source you connect to your reports affects report performance. One of today’s hottest topics, especially with the advent of hybrid work, is the cloud. More and more businesses are relying on cloud reporting to share insights across the world. While each organization has to customize their system to fit their needs, we’ve seen some incredible performance wins through cloud-based reporting. In one scenario, we reduced a client’s data processing time from half an hour to two minutes by migrating them to the cloud.

Performance win: Track your actual needs instead of your assumed needs

Ask yourself:

  • Are you using a Tabular model or cube and why?
  • Do you have Geo replication enabled (and do you need to)?
  • Are you using a load balancer (and should you be)?
  • Is the data configuration correct?

One of the most important things to consider is the end user’s experience. You don’t necessarily need everything to load fast. You just need to ensure users can quickly access the information they regularly rely upon.

For example: Power BI maintains a cache for dashboard tiles. Pulling data from the cache is faster and more reliable than querying the data source itself. If your users primarily need at-a-glance information, you can make your dashboards the user landing page and pin the most-used visuals. This way, you’ll deliver a better user experience at a fraction of the performance cost.

Report Design

Performance win: Filter your data

We all know that report design impacts user experience, but it also has a noticeable effect on report performance. After all, the more data each visual needs to display, the slower the visual will load. Design-wise, there are a couple of big-ticket items to watch out for.

Avoid using unfiltered data. Usually, users don’t need every single row and column of every table every time they open a report. Use a Top N filter to limit the number of items displayed in a table. This reduces the load on the report, improving performance.

You should also be careful when it comes to slicers. Slicers are a great way to help users navigate data, but they tank your report performance. This is because slicers always generate two queries: one to fetch data and one to fetch selection details. If you absolutely need to include slicers, use the Filter pane to evaluate which slicers are used most often, and implement only those.

Performance win: Limit your visuals

Using too many visuals in a single report turns report performance into a slog (and makes your reports difficult to read). Be mindful about which visuals you implement. In general, you should use the following Power BI performance guidelines:

  • Maximum number of widgets: 8
  • Maximum number of grids: 1
  • Maximum number of tiles: 10

Not all visual types perform the same. Grids, for example, are a massive drain on resources, while cards are a much more efficient way to deliver information. To optimize the performance of your reports, you should limit each report page to a maximum of 30 total points using the following scoring system (a quick way to tally the points is sketched after the list):

  • Cards: 1 point each
  • Gauges: 2 points each
  • Charts: 3 points each
  • Maps: 3 points each
  • Grids: 5 points each
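As a quick illustration of that budget, the sketch below tallies a page’s points from the weights above; the example visual counts are made up.

    # Tally a report page's visual "points" against the 30-point budget above.
    WEIGHTS = {"card": 1, "gauge": 2, "chart": 3, "map": 3, "grid": 5}

    def page_score(visuals):
        """visuals maps a visual type to how many of them the page contains."""
        return sum(WEIGHTS[kind] * count for kind, count in visuals.items())

    # Made-up example page: 4 cards, 2 charts, 1 map, 1 grid.
    example = {"card": 4, "chart": 2, "map": 1, "grid": 1}
    score = page_score(example)
    print(score, "within budget" if score <= 30 else "over budget")  # 18 within budget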

Some visuals are also more efficient than others. Out-of-the-box visuals are generally faster than custom ones, as they were created and vetted by the Power BI team. However, custom visuals have their merits, especially if you’re looking for something niche. When using custom visuals, prioritize visuals that have been certified. Certified visuals have the yellow checkmark next to them on AppSource, which means they have been certified by the Power BI team for performance and security.

Performance win: Limit interactivity

A final design element to watch out for is interactivity. The interactivity of your report is going to impact performance. The more interactive, the slower the report, as Power BI needs to process several requests before displaying the final result. By default, all visuals on a report page are set to interact with one another. Usually, this level of interactivity isn’t needed for end users, and results in several unnecessary queries in the back end. By reducing interactivity to only the scenarios needed by users, you can drastically improve report performance.

Network Issues

Of course, the final reason why your reports may be failing or loading slowly is your actual network. If this is an immediate issue (like your entire family arriving home and jumping onto the Wi-Fi at the exact same time), then you can wait a while and try again, or consider finding a location with better access. If you’re using a cloud-based Power BI report, you may also be experiencing network latency because of client machine resources or noisy neighbors.

Performance win: Make sure your report regions align

There are some network latency issues that you unfortunately don’t have much control over. Luckily, there are also several issues you can immediately address. Network latency affects the time it takes requests to travel to and from the Power BI service. Different tenants in Power BI are assigned to different regions. Ideally, you want your reports, tenant, and data sources in the same region. This reduces network latency by increasing the speed of data transfer and query execution.

Performance win: Configure your Power BI workloads

Network latency may be a result of unoptimized Power BI capacity settings. Optimize your capacity settings to your actual usage metrics, identifying when you should invest in advanced Power BI capacities such as Power BI Premium, Premium Per User (PPU), and Power BI Embedded. Overinvesting in Power BI can result in wasted expenses, but underinvesting can hamper the performance of key reports and dashboards.

Performance win: Manage your gateways

Whenever Power BI needs to access data that isn’t accessible over the Internet, it uses a gateway. Depending on your workload and gateway size, you need to evaluate whether you want to install an on-premises/enterprise data gateway, personal gateway, or VM-hosted infrastructure-as-a-service.

As a rule of thumb, if you’re working with larger databases, it’s better to go with an enterprise gateway rather than a personal gateway. Enterprise gateways import no data into Power BI, making them more efficient for big data. You can also create a gateway cluster for high-demand queries. This enables you to effectively load balance your gateway traffic and avoid single points of failure.

Finally, make sure you use separate gateways for Power BI service live connections and scheduled data refresh. If you’re using a single gateway for both, your live connection performance will suffer during the scheduled refresh.

***

Okay, so now you know how to diagnose the problem. You might be asking yourself: what comes next? If you want to learn more about specific steps you can take to improve your Power BI setup, check out our Power BI best practices guide. If optimizing your Power BI all on your own sounds daunting, get in touch with us at Sales@MAQSoftware.com. We'd be happy to help!


Wednesday, October 27, 2021

Is Cloud Security Part of Cybersecurity?



The short answer: yes. Cloud security is a category of cybersecurity the way an apple is a category of fruit. All apples are fruit; not all fruit is an apple. The definition of cloud security is generally something along the lines of:

Cloud security is a branch of cybersecurity dedicated to securing cloud systems from both internal and external threats.

If you were looking for the short answer, there you have it. Cloud security is a part of cybersecurity. If you want to know why, or how the approach to cloud security differs, read on.

What does cloud security cover?

Cloud security covers a wide range of processes and technologies used to secure cloud systems. This includes:

    Identity management
    Network security
    Infrastructure-level security
    Application-level security
    Data security
    Governance and threat protection

Identity management refers to the process of authenticating and authorizing identities. It’s about verifying who can access your data and how. Popular forms of identity management include multi-factor authentication and directory services such as Azure Active Directory. According to Microsoft, “Many consider identity to be the primary perimeter for [cloud] security. This is a shift from the traditional focus on network security.”

Network security refers to the process of protecting your resources from unauthorized access via network traffic controls. With network security, your aim is to ensure you only allow legitimate traffic. In cloud security, your focus is on limiting connectivity between virtual networks when possible.

Infrastructure-level security refers to security measures taken to protect your entire cloud infrastructure, including policies, applications, technologies, and controls. One of the key areas here revolves around implementing antimalware software and virtual machine (VM) best practices.

Application-level security refers to the protective measures surrounding information exchanged in collaborative cloud environments, such as Microsoft Teams, Office 365, or shared Power BI reports.

Data security refers to cloud admins’ ability to secure data through encryption or virtual private networks (VPNs). Encryption is one of the best ways for enterprises to secure their data, while VPNs are extremely popular among consumers and remote workers.

Governance and threat protection refer to how you identify and mitigate incoming threats. This covers one of the most important elements of cloud security: user awareness and training. To secure your cloud, you need to ensure your cloud users are up to date on the latest security protocols and org-wide policies. After all, user error accounts for up to 95% of cloud breaches.

What does cybersecurity cover?

Cybersecurity covers all activities focused on defending computers, servers, mobile devices, digital systems, networks, and data from malicious attacks. It’s a much larger area of digital security that includes:

    Data security
    Identity management
    Network security
    App and software security
    Data retention
    User education

You’ll notice that cloud security and cybersecurity tread a lot of the same ground. That said, cloud security is not just “the same thing as cybersecurity, but with cloud.” Cloud security is a unique area of cybersecurity that has grown exponentially in the last decade. There are some key differences that affect the way security admins approach their role.

What’s the difference between cloud security and cybersecurity?

Cloud security inherently requires buy-in from both the cloud vendor and the cloud buyer to ensure the system is secure. On the buyer end, this means defining your organization’s relationship to the cloud. Buyer considerations include whether you’re using a cloud-native or hybrid system and whether you want to invest in Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), or Software-as-a-Service (SaaS). You also need to ensure you trust the vendor provisioning the cloud system. After all, because the storage is off-prem, it’s up to the vendor to secure their system.

Cloud systems are centralized and rapidly scalable in a way that requires you to regularly ensure you are implementing security best practices. All your data is in a single location, from harmless files like handover documentation to business-critical information about sales and projections, accessible 24/7 from anywhere. This means you need to invest in data security systems to avoid unauthorized or accidental access.

Cloud systems also scale more quickly than previous data storage systems. This emphasizes the importance of training not only for users but for your network admins. They need to ensure they are up-to-date on the latest security protocols. It’s why we at MAQ Software regularly reassess our security principles and are certified with ISO/IEC 27001 for information security management and ISO 27018 for cloud security.

***

According to Forbes, by 2025, half of all the world’s data will reside in a public cloud. Cloud security is here to stay, a permanent part of the cybersecurity landscape. If you want to learn more about how we can help you secure your cloud while still reducing your average monthly costs, check out our cloud migration and modernization hub.
