Difference Between #BigData and Internet of Things | @ThingsExpo #IoT #M2M

A recent argument with folks whose intelligence I hold in high regard (like Tom, Brandon, Wei, Anil, etc.) got me thinking about the following question:

What does it mean, as a vendor, to say that you support the Internet of Things (IoT) from an analytics perspective?

I think the heart of that question really boils down to this:

What are the differences between big data (which involves analyzing large amounts of mostly human-generated data to support longer-duration use cases such as predictive maintenance, capacity planning, customer 360 and revenue protection) and IoT (which involves aggregating and compressing massive amounts of low-latency / low-duration / high-volume machine-generated data coming from a wide variety of sensors to support real-time use cases such as operational optimization, real-time ad bidding, fraud detection, and security breach detection)?

I don’t believe that loading sensor data into a data lake and performing data science to create predictive analytic models qualifies as doing IoT analytics.  To me, that’s just big data (and potentially REALLY BIG DATA with all that sensor data).  Claiming that you can deliver IoT analytic solutions requires big data (with data science and a data lake), but IoT analytics must also include:

  1. Streaming data management with the ability to ingest, aggregate (e.g., mean, median, mode) and compress real-time data coming off a wide variety of sensor devices “at the edge” of the network, and
  2. Edge analytics that automatically analyze real-time sensor data and render real-time decisions (actions) at the edge of the network that optimize operational performance (e.g., blade angle or yaw) or flag unusual performance or behaviors for immediate investigation (e.g., security breaches, fraud detection).

If you cannot manage real-time streaming data, perform real-time analytics and make real-time decisions at the edge, then you are not doing IoT or IoT analytics, in my humble opinion.  So what is required to support these IoT data management and analytic requirements?

The IoT “Analytics” Challenge
The Internet of Things (or Industrial Internet) operates at machine-scale, dealing with machine-to-machine generated data.  This machine-generated data creates discrete observations (e.g., temperature, vibration, pressure, humidity) at very high signal rates (thousands of messages per second).  Add to this the complexity that the sensor data values rarely change (e.g., temperature operates within an acceptably small range).  However, when the values do change, the ramifications will likely be important.

Consequently, to support real-time edge analytics, we need to provide detailed data that can flag observations of concern without overwhelming the ability to get meaningful data back to the core (data lake) for broader, more strategic analysis.

One way that we see organizations addressing these IoT analytics needs is via a 3-tier Analytics Architecture (see Figure 1).

Figure 1: IoT Analytics 3-Tier Architecture

We will use a wind turbine farm to help illustrate the 3-tier analytics architecture capabilities.

Tier 1 performs real-time performance analysis and optimization for the individual wind turbine.  Tier 1 must manage (ingest and compress) real-time data streams coming off of multiple, heterogeneous sensors, and it processes the incoming data against static or dynamically updated analytic models (e.g., rules-based models, decision trees) to drive immediate or near-immediate actions.
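
To make this concrete, here is a minimal sketch of how a Tier 1 gateway might evaluate incoming readings against a simple rules-based model and trigger an immediate action. The sensor names, thresholds, and actions are hypothetical illustrations, not any particular vendor’s implementation:

```python
# Minimal sketch of a Tier 1 rules-based edge decision step.
# Sensor names, thresholds, and actions below are hypothetical.

RULES = [
    # (sensor, predicate, action)
    ("vibration_mm_s", lambda v: v > 7.0, "reduce_rotor_speed"),
    ("temperature_c", lambda v: v > 105.0, "flag_for_inspection"),
    ("yaw_error_deg", lambda v: abs(v) > 15.0, "adjust_yaw"),
]

def evaluate(reading):
    """Return the list of actions triggered by a single sensor reading."""
    actions = []
    for sensor, predicate, action in RULES:
        value = reading.get(sensor)
        if value is not None and predicate(value):
            actions.append(action)
    return actions

# One reading arriving off the turbine's sensor bus
print(evaluate({"vibration_mm_s": 8.2, "temperature_c": 98.5, "yaw_error_deg": 3.0}))
# -> ['reduce_rotor_speed']
```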

Purpose-built T1 edge gateways leverage real-time data compression techniques (e.g., see the article “timeseries storage and data compression” for more information on timeseries databases) to send only a subset of the critical data (e.g., data that has changed) back to T2 and T3 (core).

Let’s say that you are monitoring the temperature of a compressor inside a large industrial engine.  Let’s say the average temperature of that compressor is 99 degrees, and it only varies between 98 and 100 degrees within a 99% confidence level.  Let’s also say the compressor is emitting the following temperature readings 10 times a second:

99, 99, 99, 98, 98, 99, 99, 98, 99, 99, 100, 99, 99, 99, 100, 99, 98, 99, 99…

You have 10,000 readings that don’t vary from that range.  So why send all of the readings (which from a transmission bandwidth perspective could be significant)?  Instead, use a timeseries database to send only the mean, median, mode, variance, standard deviation and other statistical measures of the 10,000 readings rather than the individual readings themselves.
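
As a rough sketch of that idea (assuming the gateway simply buffers a window of readings in memory; the window below reuses the sample values above), the gateway could compute and transmit only the summary statistics:

```python
# Sketch: summarize a window of temperature readings so only the
# statistics (not the raw readings) are sent back to the core.
import statistics

def summarize(readings):
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "median": statistics.median(readings),
        "mode": statistics.mode(readings),
        "stdev": statistics.stdev(readings),
        "min": min(readings),
        "max": max(readings),
    }

window = [99, 99, 99, 98, 98, 99, 99, 98, 99, 99, 100, 99, 99, 99, 100, 99, 98, 99, 99]
print(summarize(window))  # a handful of numbers instead of the full window
```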

However, let’s say that all of a sudden we start getting readings outside the normal 99% confidence level:

99, 99, 99, 100, 100, 101, 101, 102, 102, 103, 104, 104, 105, …

Then we’d apply basic Change Data Capture (CDC) techniques to capture and transmit the subset of critical data to T2 and T3 (core).
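
A minimal sketch of that CDC-style filtering, assuming a fixed 98-100 degree band derived from the 99% confidence interval (the band, stream values, and transmit step are illustrative):

```python
# Sketch: a simple change-data-capture style filter that forwards only
# readings falling outside the expected band to Tier 2 / Tier 3.
LOWER, UPPER = 98.0, 100.0  # band derived from the 99% confidence interval

def filter_changes(readings):
    """Yield (index, value) for readings outside the expected band."""
    for i, value in enumerate(readings):
        if not (LOWER <= value <= UPPER):
            yield i, value

stream = [99, 99, 99, 100, 100, 101, 101, 102, 102, 103, 104, 104, 105]
critical = list(filter_changes(stream))
print(critical)  # only these readings get transmitted to the core
# -> [(5, 101), (6, 101), (7, 102), (8, 102), (9, 103), (10, 104), (11, 104), (12, 105)]
```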

Consequently, edge gateways leverage timeseries compression techniques to drive faster automated decisions while only sending a subset of critical data to the core for further analysis and action.

The Tier 1 analytics are likely being done via an on-premise analytics server or gateway (see Figure 2).

Figure 2:  IoT Tier 1 Analytics

Tier 2 optimizes performance and predicts maintenance needs across the wind turbines in the same wind farm.  Tier 2 requires a distributed analytics engine for dynamic rule generation and execution that integrates and analyzes data aggregated across the potentially heterogeneous wind turbines. Cohort analysis is typically used to identify, validate and codify performance problems and opportunities across the cohort of wind turbines.  For example, in the wind farm, the Tier 2 analytics are responsible for real-time learning that can generate the optimal torque and position controls for the individual wind turbines. Tier 2 identifies and shares best practices across the wind turbines in the wind farm without being dependent upon the Tier 3 core analytics platform (see Figure 3).

Figure 3: Tier 2 Analytics: Optimizing Cohort Performance
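
As a rough sketch of the cohort idea (the turbine IDs, power readings, and underperformance threshold are hypothetical), Tier 2 might compare each turbine’s average output against the cohort average and flag outliers for further investigation:

```python
# Sketch: flag turbines whose average power output falls well below
# the cohort (wind farm) average. Data and threshold are illustrative.
import statistics

farm = {
    "turbine_01": [2.1, 2.0, 2.2, 2.1],  # MW readings over the same interval
    "turbine_02": [2.0, 2.1, 2.0, 2.2],
    "turbine_03": [1.4, 1.5, 1.3, 1.4],  # underperformer
}

def underperformers(farm, tolerance=0.8):
    averages = {tid: statistics.mean(vals) for tid, vals in farm.items()}
    cohort_mean = statistics.mean(averages.values())
    return [tid for tid, avg in averages.items() if avg < tolerance * cohort_mean]

print(underperformers(farm))  # -> ['turbine_03']
```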

Tier 3 is the data lake-enabled core analytics platform. The Tier 3 core analytics platform includes analytics engines, data sets and data management services (e.g., governance, metadata management, security, authentication) that enable access to the data (sensor data plus other internal and external data sources) and to existing analytic models, supporting data science analytic/predictive model development and refinement.  Tier 3 aggregates the critical data across all wind farms and individual turbines, and combines the sensor data with external data sources, which could include weather (humidity, temperature, precipitation, air particles, etc.), electricity prices, wind turbine maintenance history, quality scores for the wind turbine manufacturers, and performance profiles of the wind turbine mechanics and technicians (see Figure 4).

Figure 4:  Core Analytics for Analytic Model Development and Refinement
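
A minimal sketch of what that enrichment step might look like in the core data lake, assuming pandas purely for illustration (the column names, values, and join key are hypothetical; at scale this would typically run on a distributed engine):

```python
# Sketch: join aggregated turbine sensor summaries with external weather
# data in the core data lake, ready for predictive-model development.
import pandas as pd

sensor_summary = pd.DataFrame({
    "turbine_id": ["turbine_01", "turbine_02"],
    "date": ["2017-06-01", "2017-06-01"],
    "mean_temp_c": [99.1, 98.7],
    "mean_power_mw": [2.1, 2.0],
})

weather = pd.DataFrame({
    "date": ["2017-06-01"],
    "humidity_pct": [61],
    "wind_speed_m_s": [8.4],
})

# Enrich the sensor summaries with external weather features on the shared date key
training_set = sensor_summary.merge(weather, on="date", how="left")
print(training_set)
```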

With the rapid increase in storage and processing power at the edges of the Internet of Things (for example, the Dell Edge Gateway 3000 Series), we will see more and more analytic capabilities being pushed to the edge.

How Do You Start Your IoT Journey?
While the rapidly evolving IoT edge technologies can be very exciting (graphics processing units in gateway servers with embedded machine learning capabilities and hundreds of gigabytes of storage), the starting point for the IoT journey must first address this basic question:

How effective is your organization at leveraging data and analytics to power your business (or operational) models?

We have tweaked the Big Data Business Model Maturity Index to help organizations not only understand where they sit on the maturity index with respect to the above question, but also to provide a roadmap for how they can advance up the maturity index to become more effective at leveraging the wealth of IoT data with advanced analytics to power their business and operational models (see Figure 5).

Figure 5:  Big Data / IoT Business Model Maturity Index

To drive meaningful business impact, you will need to begin with the business and not the technology:

  • Engage the business stakeholders on day one,
  • Align the business and IT teams,
  • Understand the organization’s key business and operational initiatives, and
  • Identify and prioritize the use cases (decisions/goals) that support those business initiatives.

If you want to monetize your IoT initiatives, follow those simple guidelines and you will dramatically increase the probability of your business and monetization success.

For more details on the Internet of Things revolution, check out these blogs:

The post Difference between Big Data and Internet of Things appeared first on InFocus Blog | Dell EMC Services.

More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business”, is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power organizations’ key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
