By: Brian Bohan, Vice President of Worldwide Sales Consulting
Operational Intelligence (OI) can be leveraged to overcome the “store just-in-case” mindset toward Big Data. I see too many businesses capturing terabytes, and in some cases petabytes, of data “just in case” it might come in handy for some future initiative. In these situations, the data has not been mapped into any known business decision and action process; it is simply stored against the possibility that it might aid some decision in the future. Why is this the case? I believe it is because dealing with the Volume and Variety of data is no longer the long pole in the tent, thanks to advances in Big Data frameworks; mainstreaming this data into core business transactions, processes, and actions is far more difficult. The shortage of qualified data analysts is well known. But there is an emerging belief that perhaps the amount of data itself is the issue, or at least the focus on collecting and storing data without a clear idea of how best to put it to use while it still matters.
I have heard on too many occasions, “These routers/meters/devices/PoS systems/social sites throw off petabytes of data; there’s got to be something valuable in there.” Well, perhaps there is, but the challenge is how to pick out the signal from the noise without a clear problem definition and an idea of the actions you would take if that signal were identified. So, first, define the problem you want to solve or the opportunity you want to exploit. If this sounds basic…it is. It is fundamental, and the fact that we have near-unlimited access to data does not change its applicability.
Once we understand the objective, we can use Operational Intelligence to be more surgical in selecting the data we want to analyze and make decisions against, intelligently filtering the available data to focus on what is in scope. Possessing a well-defined objective also helps in determining latency requirements. How long a delay between the event occurrence (note that the ‘event’ here could in fact be several data points or events, making it an event of interest) and action can the organization tolerate? Gartner VP Distinguished Analyst Roy Schulte posits that if it is 15 minutes or less, then continuous real-time analytics and action, or Operational Intelligence, is required in addition to Big Data frameworks such as Hadoop. If the threshold is 15 minutes today, then, considering future trends, we should probably treat any business decision with a latency tolerance of less than an hour as requiring Operational Intelligence.
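The latency-tolerance rule of thumb above can be sketched in a few lines of code. This is a minimal illustration only: the function name, the `future_proof` flag, and the return labels are all hypothetical; the 15-minute figure comes from the Gartner guidance cited above, and the one-hour figure from the forward-looking suggestion.

```python
from datetime import timedelta

# Thresholds from the discussion above: 15 minutes today, one hour if
# we want to account for future trends. Illustrative values only.
OI_THRESHOLD = timedelta(minutes=15)
FUTURE_OI_THRESHOLD = timedelta(hours=1)

def processing_mode(latency_tolerance: timedelta,
                    future_proof: bool = False) -> str:
    """Pick an analytics approach based on how long the business can
    wait between an event of interest and the responding action."""
    threshold = FUTURE_OI_THRESHOLD if future_proof else OI_THRESHOLD
    if latency_tolerance <= threshold:
        # Continuous, streaming analytics and action (OI) is required,
        # in addition to the Big Data framework.
        return "operational-intelligence"
    # A slower batch cadence on a Big Data framework such as Hadoop suffices.
    return "batch"

print(processing_mode(timedelta(minutes=5)))           # operational-intelligence
print(processing_mode(timedelta(minutes=45)))          # batch
print(processing_mode(timedelta(minutes=45), True))    # operational-intelligence
```

The point of the sketch is simply that the decision is driven by the business's tolerable delay, not by the volume of data involved.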
The continuous, streaming analytics and sense-and-respond action integral to Operational Intelligence can focus Big Data initiatives and ensure derived insights inform smart, relevant actions in true operational time. In particular, OI can intelligently and dynamically filter Big Data so organizations can focus on what really matters. Additionally, the patterns and models determined via offline Big Data analytics can be fed into the OI layer, where streaming events that match them can be identified and responded to in real time. The key is understanding the value of the continuous analytics and action provided by OI versus the ‘fast-batch’ analytics common in Big Data frameworks.
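The pipeline described above, filtering a stream down to in-scope events, matching them against patterns produced offline, and responding on a match, can be sketched as follows. Every name, field, and threshold here is a hypothetical illustration, not any particular product's API.

```python
# Patterns that an offline batch job might have produced and pushed to the
# OI layer (hypothetical example: flag high-value point-of-sale transactions).
OFFLINE_PATTERNS = [
    {"source": "pos", "metric": "txn_value", "min": 10_000},
]

def in_scope(event: dict) -> bool:
    """Dynamic filter: keep only events relevant to the defined objective."""
    return event.get("source") in {"pos", "meter"}

def matches(event: dict, pattern: dict) -> bool:
    """Check a single event against a single offline-derived pattern."""
    return (event.get("source") == pattern["source"]
            and event.get("metric") == pattern["metric"]
            and event.get("value", 0) >= pattern["min"])

def process_stream(events):
    """Continuous sense-and-respond loop: yields one action per match."""
    for event in events:
        if not in_scope(event):
            continue  # noise is dropped before any deeper analysis
        for pattern in OFFLINE_PATTERNS:
            if matches(event, pattern):
                yield f"alert: {event['source']} {event['metric']}={event['value']}"

stream = [
    {"source": "router", "metric": "pkts", "value": 1},        # out of scope
    {"source": "pos", "metric": "txn_value", "value": 12_500},  # matches
]
print(list(process_stream(stream)))  # ['alert: pos txn_value=12500']
```

The design point is the division of labor: the Big Data framework does the heavy offline work of discovering patterns, while the OI layer applies them continuously, event by event, so the response happens while it still matters.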