
@BigDataExpo: Article

Babies, Big Data, and IT Analytics

Machine learning is a topic that has gone from obscure niche to mainstream visibility over the last few years

Machine learning and IT analytics can be as beneficial to IT operations as they are to monitoring the vital signs of premature babies, where they identify danger signs too subtle or abnormal to be detected by a human. But to get meaningful results from machine learning, an enterprise must be willing to implement monitoring and instrumentation that gathers data across organizational silos and incorporates business activity.

Machine learning is a topic that has gone from obscure niche to mainstream visibility over the last few years. High-profile software companies like Splunk have tapped into the Big Data "explosion" to highlight the benefits of building systems that use algorithms and data to make decisions and evolve over time.

One recent article on the O'Reilly Radar blog caught my attention by making a connection between web operations and medical care for premature infants. "Operations, machine learning, and premature babies" by Mike Loukides describes how machine learning is used to analyze data streamed from dozens of monitors connected to each baby. The algorithms are able to detect dangerous infections a full day before any symptoms are noticeable to a human.

An interesting point from the article is that the machine learning system is not looking for spikes or irregularities in the data; it is actually looking for the opposite. Babies who are about to become sick stop exhibiting the normal variations in vital signs shown by healthy babies. It takes a machine learning system to detect changes in behavior too subtle for a human to notice.

Mike Loukides then wonders whether machine learning can be applied to web operations. Typical performance monitoring focuses on thresholds to identify a problem. "But what if crossing a threshold isn't what indicates trouble, but the disappearance (or diminution) of some regular pattern?" Machine learning could identify symptoms that a human misses because they are only looking for thresholds to be crossed.
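The idea of alerting on the disappearance of normal variation, rather than on a threshold crossing, can be sketched with a simple rolling-variance check. This is a hypothetical illustration (the window size, ratio, and function names are my own assumptions, not the system described in the article):

```python
from collections import deque

def detect_flatline(samples, window=60, ratio=0.25):
    """Flag points where recent variance collapses relative to a
    longer-term baseline -- the 'disappearance of a regular pattern'
    rather than a threshold being crossed."""
    recent = deque(maxlen=window)          # short-term window
    baseline = deque(maxlen=window * 10)   # longer-term behavior
    alerts = []

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((v - m) ** 2 for v in xs) / len(xs)

    for i, x in enumerate(samples):
        recent.append(x)
        baseline.append(x)
        # Only compare once both windows have enough history.
        if len(recent) == window and len(baseline) > window * 2:
            if variance(recent) < ratio * variance(baseline):
                alerts.append(i)  # normal variation has vanished
    return alerts
```

A healthy, oscillating metric produces no alerts; when the signal goes unnaturally quiet, the recent variance drops far below the baseline and the check fires, even though no fixed threshold was ever crossed.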

Mike's conclusion sums up much of the state of the IT industry concerning machine learning:

At most enterprises, operations have not taken the next step. Operations staff doesn't have the resources (neither computational nor human) to apply machine intelligence to our problems. We'd have to capture all the data coming off our servers for extended periods, not just the server logs that we capture now, but every kind of data we can collect: network data, environmental data, I/O subsystem data, you name it.

As someone who works for a company that applies a form of machine learning (Behavior Learning for predictive analytics) to IT operations and application performance management, I read this with great interest. I didn't necessarily disagree with his conclusion but tried to pull apart the reasoning behind why more companies aren't applying algorithms to their IT data to look for problems.

There are at least three requirements for companies that want to move ahead in this area:

1. Establish maturity of one's monitoring infrastructure. This is the most fundamental point. If you want to apply machine intelligence to IT operations, you first need to add instrumentation and monitoring. Numerous monitoring products and approaches exist, but you have to gather the data before you can analyze it.

2. Coordinate multiple enterprise silos. Modern IT applications are increasingly complex and may cross multiple enterprise silos such as server virtualization, network, databases, application development, and other middleware components. Enterprises must be willing to coordinate between these multiple groups in gathering monitoring data and performing cross-functional troubleshooting when there are performance or uptime issues.

3. Incorporate business activity monitoring (BAM). Business activity data provides the "vital signs" of a business. Examples of retail business activity data include number of units sold, total gross sales, and total net sales for a time period. Knowing the true business impact of an application performance problem requires correlating this business data. When an outage lasted 20 minutes, how many fewer units were sold? What was the reduction in gross and net sales?
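The correlation in the third requirement can be sketched in a few lines: compare sales during the outage window against a per-minute baseline from the rest of the day. The function name, inputs, and numbers here are hypothetical, for illustration only:

```python
def outage_impact(sales_per_minute, outage_start, outage_len, unit_price):
    """Estimate units and gross sales lost during an outage window,
    using the rest of the day as the healthy baseline."""
    outage = sales_per_minute[outage_start:outage_start + outage_len]
    normal = (sales_per_minute[:outage_start]
              + sales_per_minute[outage_start + outage_len:])
    baseline_rate = sum(normal) / len(normal)   # units/minute when healthy
    expected = baseline_rate * outage_len       # what we'd expect to sell
    lost_units = max(0.0, expected - sum(outage))
    return {
        "lost_units": lost_units,
        "lost_gross_sales": lost_units * unit_price,
    }
```

For example, a store averaging 10 units/minute that sells only 2 units/minute during a 20-minute outage has lost roughly 160 units; multiplying by unit price turns an IT incident into a gross-sales figure the business understands.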

An organization that fulfills these requirements is positioned to apply analytics successfully and achieve real benefits in IT operations. Gartner has established the ITScore Maturity Model for determining one's sophistication in availability and performance monitoring. Here is the description for level 5, the top tier:

Behavior Learning engines, embedded knowledge, advanced correlation, trend analysis, pattern matching, and integrated IT and business data from sources such as BAM provide IT operations with the ability to dynamically manage the IT infrastructure in line with business policy.

Applying machine learning to IT operations isn't easy. Most enterprises don't do it because it requires overcoming organizational inertia and gathering data from multiple groups scattered throughout the enterprise. The organizations willing to do this, however, will see tangible business benefits. Just as a hospital can algorithmically detect the failing health of a premature infant, an enterprise that uses machine learning can see clearly how abnormal problems within IT operations impact revenue.

More Stories By Richard Park

Richard Park is Director of Product Management at Netuitive. He currently leads Netuitive's efforts to integrate with application performance and cloud monitoring solutions. He has nearly 20 years of experience in network security, database programming, and systems engineering. Some past jobs include product management at Sourcefire and Computer Associates, network engineering and security at Booz Allen Hamilton, and systems engineering at UUNET Technologies (now part of Verizon). Richard has an MS in Computer Science from Johns Hopkins, an MBA from Harvard Business School, and a BA in Social Studies from Harvard University.


