
Big Data Journal: Article

Detecting Anomalies that Matter!

Like needles in a haystack

As Netuitive's Chief Data Scientist, I am fortunate to work closely with some of the world's largest banks, telcos, and eCommerce companies. Increasingly, the executives I speak with at these companies are no longer focused on just detecting application performance anomalies - they want to understand the impact those anomalies have on the business. For example: "Is the current slowdown in the payment service impacting sales?"

You can think of it as detecting IT operations anomalies that really matter - but this is easier said than done.

Like Needles in a Haystack
When it comes to IT analytics, there is a general notion that the more monitoring data you are able to consume, analyze, and correlate, the more accurate your results will be. Just pile all that infrastructure, application performance, and business metric data together and good things are bound to happen, right?

Larger organizations typically have access to voluminous data generated by dozens of monitoring tools tracking thousands of infrastructure and application components. At the same time, these companies often track hundreds of business metrics using a totally different set of tools.

The problem is that, collectively, these monitoring tools do not communicate with each other.  Not only is it hard to get holistic visibility into the performance and health of a particular business service, it's even harder to discover complex anomalies that have business impact.

Anomalies are Like Snowflakes
Compounding the challenge is the fact that no two anomalies are alike.  Anomalies that matter have multiple facets.  They reflect a composite behavior of many layers of interacting and inter-dependent components.  Additionally, they can be cleverly disguised or hidden in a haze of visible but insignificant noise.  No matter how many graphs and charts you display on the largest LCD monitor you can find - the type of scalable real-time analysis required to find and expose what's important is humanly impossible.

Enter IT Operations Analytics
Analytics such as statistical machine learning allow us to understand the "normal" behavior of each resource we are tracking - be it a single IT component, web service, application, or business process. Additional algorithms help us find patterns and correlations between the thousands of IT and business metrics that matter in a critical service.
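To make the baseline idea concrete, here is a minimal sketch in Python - purely illustrative, not Netuitive's actual algorithms: learn a trailing window of "normal" behavior for a metric and flag observations that deviate far from it.

```python
import statistics

def zscore_anomalies(series, window=30, threshold=3.0):
    """Flag points that deviate more than `threshold` standard
    deviations from a trailing-window baseline of 'normal' behavior."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.pstdev(baseline)
        if stdev > 0 and abs(series[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# A hypothetical latency metric with mild jitter and one injected spike:
latency_ms = [100.0 + (i % 2) for i in range(60)]
latency_ms[45] = 500.0
print(zscore_anomalies(latency_ms))  # → [45]
```

Real platforms go well beyond a simple z-score - but even this toy version shows the core shift from static thresholds to behavior learned per resource.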

The Shift Towards IT Operations Analytics is Already Happening
This is not about the future.  It's about what companies are doing today.

Several years ago, thought-leading enterprises (primarily large banks with critical revenue-driving services) began experimenting with a new breed of IT analytics platform. These companies' electronic and web-facing businesses had so much revenue (and reputation) at stake that they needed to find the anomalies that matter - the ones that were truly indicative of current or impending problems.

Starting with an almost "blank slate", these forward-thinking companies began developing open IT analytics platforms that easily integrated any type of data source in real time to provide a comprehensive view of patterns and relationships between IT infrastructure and business service performance. This was only possible with technologies that leveraged sophisticated data integration, knowledge modeling, and analytics to discover and capture the unique behavior of complex business services.  Anything less would fail, because, like snowflakes, no two anomalies are alike.

The Continuous Need for Algorithm Research
The online banking system at one bank differs from the online system at the next bank. And the transaction slowdown that occurred last week may have a totally different root cause than the one two months ago. Even more interesting are external factors such as seasonality and its effect on demand. For example, payment companies see increased workload around holidays such as Thanksgiving and Mother's Day, whereas gaming/betting companies' demand is driven more by events such as the NFL Playoffs or the World Series.
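One way to account for such seasonality - sketched here with made-up buckets and data, not any specific customer's - is to judge each observation against the baseline for its own seasonal bucket (say, day of week) rather than a single global average:

```python
from collections import defaultdict
import statistics

def seasonal_baselines(observations):
    """Build a (mean, stdev) baseline per seasonal bucket
    (here, day of week) from (bucket, value) pairs."""
    buckets = defaultdict(list)
    for bucket, value in observations:
        buckets[bucket].append(value)
    return {k: (statistics.fmean(v), statistics.pstdev(v))
            for k, v in buckets.items()}

def is_seasonal_anomaly(bucket, value, baselines, threshold=3.0):
    """Flag a value that deviates from its own bucket's norm."""
    mean, stdev = baselines[bucket]
    return stdev > 0 and abs(value - mean) / stdev > threshold

# Hypothetical payment volumes: Mondays run ~100, Saturdays ~300.
history = [("Mon", 100 + i) for i in range(5)] + \
          [("Sat", 300 + i) for i in range(5)]
baselines = seasonal_baselines(history)

# 120 looks unremarkable overall, but is far below a normal Saturday:
print(is_seasonal_anomaly("Sat", 120, baselines))  # → True
print(is_seasonal_anomaly("Mon", 103, baselines))  # → False
```

The same pattern extends to hour-of-day, holiday calendars, or sporting-event schedules - whatever seasonal key fits the business.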

For this reason, analytics research is an ongoing endeavor at Netuitive - driven in part by customer needs and in part by advances in technology. Once Netuitive technology is installed in an enterprise and integrating data collected across multiple layers of the service stack, behavior learning begins immediately. As time passes, the statistical algorithms have more observations to feed their results, which leads to increasing confidence in both the anomalies detected and the proactive forecasts. Additionally, customer domain knowledge can be layered into Netuitive's real-time analysis in the form of knowledge bases and supervised learning algorithms. The Research Group at Netuitive works closely with our Professional Services Group, as well as directly with customers, to regularly review actual delivered alarm quality - both to tune the algorithms we have and to identify new algorithms that would deliver greater value in an actionable timeframe.
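The way confidence grows with observations can be sketched with a generic streaming estimator - this uses Welford's online algorithm as an illustration, not Netuitive's proprietary method. The standard error of the learned baseline shrinks roughly as 1/sqrt(n), so the same baseline firms up as data accumulates:

```python
import math

class RunningBaseline:
    """Welford's online mean/variance: one pass, constant memory."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0  # running sum of squared deviations

    def observe(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    def stderr(self):
        """Standard error of the estimated mean; shrinks as
        observations accumulate, so anomaly calls and forecasts
        built on the baseline carry more confidence over time."""
        if self.n < 2:
            return float("inf")
        return math.sqrt(self._m2 / (self.n - 1)) / math.sqrt(self.n)

b = RunningBaseline()
for i in range(1000):
    b.observe(100.0 + (i % 5))  # a metric cycling through 100..104
    if b.n == 10:
        early = b.stderr()
print(early > b.stderr())  # → True: far tighter after 1,000 points
```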

Since Netuitive's software architecture allows for "pluggable" algorithms, we can incrementally introduce new analytics capabilities easily, at first in an experimental or laboratory setting and ultimately, once verified, into production.
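A "pluggable" algorithm design can be sketched as a simple registry - an illustrative pattern only; the names and interfaces here are invented, not Netuitive's actual architecture. New detectors register under a name, and the pipeline looks them up at run time, so an experimental algorithm can be promoted to production without touching the calling code:

```python
# Registry of anomaly-detection algorithms, keyed by name.
ALGORITHMS = {}

def register(name):
    """Decorator that plugs an algorithm into the registry."""
    def decorator(fn):
        ALGORITHMS[name] = fn
        return fn
    return decorator

@register("fixed_threshold")
def fixed_threshold(series, limit=100.0):
    """Baseline detector: flag indices exceeding a static limit."""
    return [i for i, v in enumerate(series) if v > limit]

@register("spike_ratio")
def spike_ratio(series, ratio=2.0):
    """Experimental detector: flag points that jump by more than
    `ratio` versus the previous point."""
    return [i for i in range(1, len(series))
            if series[i - 1] > 0 and series[i] / series[i - 1] > ratio]

def detect(series, algorithm="fixed_threshold", **params):
    """Pipeline entry point; the algorithm is chosen by name."""
    return ALGORITHMS[algorithm](series, **params)

print(detect([50, 150, 70], limit=100))                # → [1]
print(detect([50, 150, 70], algorithm="spike_ratio"))  # → [1]
```

Swapping the `algorithm` parameter is all it takes to trial a new detector against live data.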

The IT operations management market has matured over the past two decades to the point that most critical components are well instrumented. The data is there, and mainstream IT organizations (not just visionary early adopters) realize that analytics delivers measurable and tangible value. My vision - and challenge - is to get our platform to the point where customers can easily customize the algorithms on their own as their needs and IT infrastructure evolve over time. This is where platforms need to go, given the endless variety of ways that enterprises must discover and remediate "anomalies that matter."

Stay tuned. In an upcoming blog I will drill down on some specific industry examples of algorithms we developed as part of large enterprise IT analytics platform solutions.

More Stories By Elizabeth A. Nichols, Ph.D.

As Chief Data Scientist for Netuitive, Elizabeth A. Nichols, Ph.D. leads development of algorithms, models, and analytics. This includes both enriching the company’s current portfolio as well as developing new analytics to support current and emerging technologies and IT-dependent business services across multiple industry sectors.

Previously, Dr. Nichols co-founded PlexLogic, a provider of open analytics services for quantitative data analysis, risk modeling and data visualization. In her role as CTO and Chief Data Scientist, she developed a cloud platform for collecting, cleansing and correlating data from heterogeneous sources, computing metrics, applying algorithms and models, and visualizing results. Prior to PlexLogic, Dr. Nichols co-founded and served as CTO for ClearPoint Metrics, a security metrics software platform that was eventually sold to nCircle. Prior to ClearPoint Metrics, Dr. Nichols served in technical advisory and leadership positions at CA, Legent Corp, BladeLogic, and Digital Analysis Corp. At CA, she was VP of Research and Development and Lead Architect for agent instrumentation and analytics for CA Unicenter. After receiving a Ph.D. in Mathematics from Duke University, she began her career as an operations research analyst developing war gaming models for the US Army.
