The Need for Operational Analytics

Rick Negrin

VP, Product Management

The proliferation of streaming analytics and instant decisions powering dynamic applications, along with the rise of predictive analytics, machine learning, and operationalized artificial intelligence, has introduced a requirement for a new type of database workload: operational analytics.

The two worlds of transactions and analytics, set apart from each other, are a relic of a time before data became an organization’s most valuable asset. Operational analytics is a new set of database requirements and system demands that are integral to achieving competitive advantage for the modern enterprise.

Gartner flagged this approach, under the name “continuous analytics,” as one of its Top 10 technologies for 2019. Delivering operational analytics at scale is the key to the real-time dashboards, predictive analytics, machine learning, and enhanced customer experiences that differentiate digital transformation leaders from followers.

However, companies are struggling to build these new solutions because existing legacy database architectures cannot meet the demands placed on them. The existing data infrastructure cannot scale to the load put on it, and it doesn’t natively handle all the new sources of data.

The separation between transactional and analytic technologies results in hard tradeoffs that leave solutions lacking in operational capability, analytics performance, or both. There have been many attempts in the NoSQL space to bridge the gap, but all have fallen short of meeting the needs of this new workload.

Operational analytics enables businesses to leverage data to enhance productivity, expand customer and partner engagement, and support orders of magnitude more simultaneous users. But these requirements demand a new breed of database software that goes beyond the legacy architecture.

The industry calls these systems by several names: hybrid transaction and analytics processing (HTAP) from Gartner; hybrid operational/analytics processing (HOAP) from 451 Research; and translytical from Forrester.

Consulting firms typically use the term we have chosen here, operational analytics, and Capgemini has even established a full operational analytics consultancy practice around it.

The Emergence of Operational Analytics

Operational analytics has emerged alongside the existing workloads of Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP). I outlined the requirements of those workloads in a previous blog entry.

To summarize: OLTP requires fast data lookups, transactionality, availability, reliability, and scalability, whereas OLAP requires fast execution of complex queries, support for large data sets, and batch ingest of large amounts of data. OLTP- and OLAP-based systems served us well for a long time, but over the last few years things have changed.
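
To make the contrast concrete, here is a minimal sketch of the two query shapes against a hypothetical orders table: a single-row OLTP lookup versus an OLAP aggregation that scans the whole table.

```sql
-- Hypothetical orders table, for illustration only.

-- OLTP-style point lookup: touches one row, expected in milliseconds.
SELECT order_id, status, total
FROM orders
WHERE order_id = 123456;

-- OLAP-style analytical query: scans and aggregates the full history.
SELECT region,
       YEAR(order_date) AS yr,
       MONTH(order_date) AS mo,
       SUM(total) AS revenue
FROM orders
GROUP BY region, yr, mo;
```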

Decisions should not have to wait for the data

It is no longer acceptable to wait for the next quarter, or week, or even day to get the data needed to make a business decision. Companies are increasingly online all the time; “down for maintenance” and “business hours” are quickly becoming a thing of the past. Companies that have a streaming real-time data flow have a significant edge over their competitors. Existing legacy analytics systems were simply not designed to work like this.

Companies must become insight driven

This means that, instead of a handful of analysts querying the data, you have hundreds or thousands of employees hammering your analytics systems every day in order to make informed decisions about the business. In addition, there will be automated systems – ML/AI and others – also running queries to get the current state of the world to feed their algorithms. The existing legacy analytics systems were simply not designed for this kind of usage.

Companies must act on insights to improve customer experience

Companies want to expose their data to their customers and partners. This improves the customer experience and potentially adds net new capabilities. For example, a cable company can track users as they set up their cable modems and proactively reach out if it sees a problem. This requires a system that can analyze and react in real time.

Another example is an electronics company that sells smart TVs and wants to expose which shows customers are watching to its advertisers. This dramatically increases the number of users trying to access its analytics systems.

In addition, the expectations of availability and reliability are much higher for customers and partners, so you need a system that can deliver an operational service level agreement (SLA). And since partners don’t work inside your company, you are exposing content outside the corporate firewall, so strong security is a must. The existing legacy analytics systems were simply not designed for this kind of usage.

Data is coming from many new sources and in many types and formats

The amount of data being collected is growing tremendously. Not only is it being collected from operational systems within the company; data is also coming from edge devices. The explosion of IoT devices, such as oil drills, smart meters, household appliances, and factory machinery, is a key contributor to this growth.

All this data needs to be fed into the analytics system. This leads to increased complexity in the types of data sources (such as Kafka and Spark), the data types and formats (geospatial, JSON, Avro, Parquet, raw text, etc.), and the throughput requirements for ingest. Again, the existing legacy analytics systems were simply not designed for this kind of usage.

The Rise of Operational Analytics

These changes have given rise to a new database workload: operational analytics. The short description of operational analytics is an analytical workload that needs an operational SLA. Now let’s unpack what that looks like.

Operational Analytics as a Database Workload

Operational analytics primarily describes analytical workloads, so the query shapes and complexity are similar to OLAP queries. The data sets for operational analytics are just as large as OLAP data sets, although it is often the most recent data, usually a fraction of the total, that matters most. Data loading is similar to OLAP workloads, in that data comes from an external source and is loaded independently of the applications or dashboards that run the queries.

But this is where the similarities end. Operational analytics has several characteristics that set it apart from pure OLAP workloads: the speed of data ingestion, scaling for concurrency, availability and reliability, and the speed of query response.

Operational analytics workloads require an SLA on how fast the data needs to be available. Sometimes this is measured in seconds or minutes, which means the data infrastructure must allow streaming the data in constantly, while still allowing queries to be run.

Sometimes this means there’s a window of time (usually a single-digit number of hours) during which all the data must be ingested. As data sets grow, existing data warehouse (DW) technologies have had trouble loading the data within the time window (and certainly don’t allow streaming). Data engineers often have to resort to complex tricks to continue meeting data loading SLAs with existing DW technologies.

Data also has to be loaded from a larger set of data sources than in the past. It used to be that data was batch-loaded from an operational system during non-business hours. Now data comes in from many different systems.

In addition, data can flow from various IoT devices far afield from the company data center. The data gets routed through various types of technologies (message queues like Kafka, processing engines like Spark, and so on). Operational analytics workloads need to easily handle ingesting from these disparate data sources.

Operational analytics workloads also need to scale to large numbers of concurrent queries. With the drive towards being data driven and exposing data to customers and partners, the number of concurrent users (and therefore concurrent queries) has increased dramatically. In an OLAP workload, five to ten queries at a time was the norm. Operational analytics workloads often must handle tens, hundreds, or even thousands of concurrent queries.

As in an OLTP workload, availability and reliability are also key requirements. Because these systems are now exposed to customers or partners, the SLA required is a lot stricter than for internal employees.

Customers expect 99.9% or better uptime, and they expect the system to behave reliably. They are also less tolerant of planned maintenance windows. So the data infrastructure backing these systems needs to support high availability, with the ability to handle hardware and other types of failure.

Maintenance operations (such as upgrading the system software or rebalancing data) need to become transparent, online operations that are not noticeable to the users of the system. In addition, the system should self-heal when a problem occurs, rather than waiting for an operator to get alerted to an issue and respond.

Strong durability is important as well: even though lost data could be reloaded, the reloading may cause the system to break its availability SLA.

The ability to retrieve the data you are looking for very quickly is the hallmark of a database system. Getting access to the right data quickly is a huge competitive advantage. Whether internal users are trying to get insights into the business or analytics results are being presented to a customer, the expectation is that the data they need is available instantly.

The speed of the query needs to be maintained regardless of the load on the system. It doesn’t matter if there is a peak number of users online, the data size has expanded, or there are failures in the system. Customers expect you to meet their expectations on every query with no excuses.

This requires a solid distributed query processor that can pick the right plan to answer any query and get it right every time. It means the algorithms used must scale smoothly with the system as it grows in every dimension.
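
Most SQL engines, SingleStore included, let you inspect the plan the optimizer chose. As a quick sketch against hypothetical tables (output format varies by product and version):

```sql
-- Ask the optimizer how it would execute a distributed join and
-- aggregation (hypothetical orders and customers tables).
EXPLAIN
SELECT c.segment, COUNT(*) AS order_count
FROM orders o
JOIN customers c ON c.id = o.customer_id
GROUP BY c.segment;
```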

Supporting Operational Analytics Use Cases with SingleStore

SingleStore was built to address these requirements in a single converged system. SingleStore is a distributed relational database that supports ANSI SQL. It has a shared-nothing, scale-out architecture that runs well on industry standard hardware.

This allows SingleStore to scale in a linear fashion simply by adding machines to a cluster. SingleStore supports all the analytical SQL language features you would find in a standard OLAP system, such as joins, group by, aggregates, etc.
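
As a sketch of that analytical surface area, the following query, over a hypothetical sales table, combines a join-free aggregation with a window function to rank products within each region:

```sql
-- Rank products by revenue within each region (hypothetical schema).
SELECT region,
       product,
       SUM(amount) AS revenue,
       RANK() OVER (PARTITION BY region
                    ORDER BY SUM(amount) DESC) AS region_rank
FROM sales
GROUP BY region, product;
```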

It has its own extensibility mechanism so you can add stored procedures and functions to meet your application requirements. SingleStore also supports the key features of an OLTP system: transactions, high availability, self-healing, online operations, and robust security.

It has two storage subsystems: an on-disk column store that gives you the advantage of compression and extremely fast aggregate queries, as well as an in-memory row store that supports fast point queries, aggregates, indices, and more. The two table types can be mixed in one database to get the optimal design for your workload.
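
A minimal sketch of that mixed design follows, with a hypothetical schema (the exact DDL varies across SingleStore versions): a rowstore table for fast-changing current state alongside a columnstore table for history.

```sql
-- In-memory rowstore (the default table type): fast point lookups
-- and updates on current state.
CREATE TABLE current_positions (
  account_id BIGINT NOT NULL,
  symbol VARCHAR(16) NOT NULL,
  quantity DECIMAL(18,4) NOT NULL,
  PRIMARY KEY (account_id, symbol)
);

-- On-disk columnstore: compression and fast scans over history.
CREATE TABLE trade_history (
  account_id BIGINT NOT NULL,
  symbol VARCHAR(16) NOT NULL,
  traded_at DATETIME NOT NULL,
  quantity DECIMAL(18,4) NOT NULL,
  price DECIMAL(18,4) NOT NULL,
  KEY (traded_at) USING CLUSTERED COLUMNSTORE,
  SHARD KEY (account_id)
);
```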

Finally, SingleStore has a native data ingestion feature, called Pipelines, that allows you to easily and very quickly ingest data from a variety of data sources (such as Kafka, AWS S3, Azure Blob, and HDFS). All these capabilities offered in a single integrated system add up to making it the best data infrastructure for an operational analytics workload, bar none.
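
As a sketch, creating and starting a Pipeline that streams JSON events from Kafka into a table looks roughly like this (the broker address, topic name, and column mapping are assumptions for illustration):

```sql
-- Continuously ingest JSON events from a Kafka topic into user_events
-- (hypothetical broker, topic, and schema).
CREATE PIPELINE events_pipeline AS
  LOAD DATA KAFKA 'kafka-broker:9092/user_events'
  INTO TABLE user_events
  FORMAT JSON (user_id <- user_id, event_type <- event_type, ts <- ts);

START PIPELINE events_pipeline;
```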

Describing the workload in general terms is a bit abstract, so let’s dig into some of the specific use cases where operational analytics is the most useful.

Portfolio Analytics

One of the most common use cases we see in financial services is portfolio analytics. Multiple SingleStore customers have written financial portfolio management and analysis systems that are designed to provide premium services to elite users.

These elite users can be private banking customers with high net worth or fund managers who control a large number of assets. They will have large portfolios with hundreds or thousands of positions. They want to be able to analyze their portfolio in real-time, with graphical displays that are refreshed instantaneously as they filter, sort, or change views in the application. The superb performance of SingleStore allows sub-second refresh of the entire screen with real-time data, including multiple tables and charts, even for large portfolios.
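
The core of such a dashboard refresh is ordinary SQL. A minimal sketch, assuming hypothetical positions and quotes tables, values one account’s portfolio against live prices:

```sql
-- Real-time valuation of one account's portfolio (hypothetical schema).
SELECT p.symbol,
       p.quantity,
       q.last_price,
       p.quantity * q.last_price AS market_value
FROM positions p
JOIN quotes q ON q.symbol = p.symbol
WHERE p.account_id = 42
ORDER BY market_value DESC;
```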

These systems also need to scale to hundreds or thousands of users concurrently hitting the system, especially when the market is volatile. Lastly, they need to bring in the freshest market data, without compromising the ability to deliver the strict latency SLAs for their query response times.

They need to do all of this securely, without violating relevant compliance requirements or the trust of their users. High availability and reliability are key requirements, because the market won’t wait. SingleStore is the ideal data infrastructure for this operational analytics use case, as it solves the key requirements of fast data ingest, high-scale concurrent user access, and fast query response.

Predictive Maintenance

Another common use case we see is predictive maintenance. Customers who have services or devices that are running continuously want to know as quickly as possible if there is a problem.

This is a common scenario for media companies that do streaming video. They want to know if there is a problem with the quality of the streaming so they can fix it, ideally before the user notices the degradation.

This use case also comes up in the energy industry. Energy companies have devices (such as oil drills and wind turbines) in remote locations. Tracking the health of those devices and making adjustments can extend their lifetimes and save millions of dollars in labor and replacement equipment.

The key requirements are the ability to stream the data about the device or service, analyze the data – often using a form of ML that leverages complex queries – and then send an alert if the results show any issues that need to be addressed. The data infrastructure needs to be online 24/7 to ensure there is no delay in identifying these issues.
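
A minimal sketch of the analysis step, assuming a hypothetical sensor_readings table and threshold, flags devices whose recent average temperature is out of range:

```sql
-- Flag devices whose 5-minute average temperature exceeds a threshold
-- (hypothetical table, column names, and threshold).
SELECT device_id,
       AVG(temperature) AS avg_temp_5m
FROM sensor_readings
WHERE reading_ts >= NOW() - INTERVAL 5 MINUTE
GROUP BY device_id
HAVING AVG(temperature) > 90;
```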

Personalization

A third use case is personalization: customizing the experience for a customer. This use case pops up in a number of different verticals, whether a user is visiting a retail web site, playing a game in an online arcade, or even visiting a brick-and-mortar store.

The ability to see a user’s activity and, more importantly, learn what is attractive to them, gives you the information to meet their needs more effectively and efficiently. One of SingleStore’s customers is a gaming company. They stream information about the user’s activity in the games, process the results against a model in SingleStore, and use the results to offer the user discounts for new games and other in-app purchases.
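
A simplified sketch of the scoring step, with a hypothetical schema standing in for the model, joins recent player activity against per-game offer affinities:

```sql
-- Score candidate offers for each player from recent activity
-- (hypothetical tables; a real model would be more involved).
SELECT a.player_id,
       o.offer_id,
       SUM(a.minutes_played * o.affinity_score) AS offer_score
FROM recent_activity a
JOIN game_offers o ON o.game_id = a.game_id
GROUP BY a.player_id, o.offer_id
ORDER BY a.player_id, offer_score DESC;
```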

Another example is a popular music delivery service that uses SingleStore to analyze usage of the service to optimize ad spend. The size of data and the number of employees using the system made it challenging to deliver the data in a timely way to the organization and allow them to query the data interactively. SingleStore significantly improved their ability to ingest and process the data and allowed their users to get a dramatic speedup in their query response times.

Summary

Operational analytics is a new workload that encompasses the operational requirements of an OLTP workload – data lookups, transactionality, availability, reliability, and scalability – as well as the analytical requirements of an OLAP workload – large data sets and fast queries.

Coupled with the new requirements of high user concurrency and fast ingestion, the operational analytics workload is tough to support with a legacy database architecture or by cobbling together a series of disparate tools. As businesses continue along their digital transformation journeys, they are finding that more and more of their workloads fit this pattern, and they are searching for modern data infrastructure, like SingleStore, with the performance and scale to handle them.

