PubSub and Other Disruptive Emerging Network Models

#IoT #OpenStack #SDN

When people write about software-defined architectures being "disruptive" to the network, they're doing a bit of a disservice to just how much change is occurring under the hood in the engine that drives today's businesses. The notion of separating the control and data planes is superficial in that it describes a general concept, and it isn't really all that radical a change if you think about it.

The control and data planes have always been separate. We have, since the need for web-scale networks came about, implemented separate topological (and usually physical) networks specifically for the purpose of segregating control traffic from the data path. The reasons for this are many: to keep management (control) traffic from interfering with the delivery of applications (and vice versa), to enable a model in which control over the critical path for applications could be secured and to ensure access to necessary control functions in the face of failure or attack.

What's new with software-defined architectures is not just the logical separation but the physical decoupling and change in component responsibility. In traditional networks there is a logically separate control plane, but it is distributed; it resides on each physical component. In software-defined networks it is physically separate but it is centralized; control responsibility resides in a single component, the "controller".

Now, OpenStack and emerging models for scaling the systems that will be responsible for managing communication with and for the Internet of Things (like MQTT) use a similar control model, but it's not as active as the software-defined architectural model; it's more a passive model. That's the nature of PubSub imposing itself on the network.

PubSub Control Model

PubSub (publish / subscribe) is a familiar model to application developers. It's a middleware staple that's been used for a very long time to distribute messages to a variable set of systems. In a nutshell, PubSub is based on the notion of a "queue" to which authorized components can publish events or messages of interest and to which interested components can subscribe. Events or messages have a life (like a TTL) and eventually expire. In the interim, it's expected that subscribed components check the queue for messages periodically. They poll for events or messages.

The queue itself is much like a switch or router's queue, except messages in the queue are duplicated until the TTL runs out and the message expires.

PubSub is passive; that is, it does not actively distribute messages. It merely serves as a kind of centralized repository, making relevant information about the state of applications and/or the network available to the components that need it.
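To make the model concrete, here is a minimal sketch of such a queue, with hypothetical names not tied to any particular broker (MQTT or otherwise): publishers post messages with a TTL, and subscribed components poll for whatever has not yet expired.

```python
# Minimal, illustrative pubsub queue (hypothetical; not any specific broker).
# Publishers post messages with a TTL; subscribers poll for what has not yet
# expired. Expired messages are dropped when the queue is polled.

import time
from dataclasses import dataclass, field


@dataclass
class Message:
    topic: str
    payload: dict
    ttl: float                      # seconds the message remains visible
    published_at: float = field(default_factory=time.time)

    def expired(self) -> bool:
        return time.time() - self.published_at > self.ttl


class PubSubQueue:
    def __init__(self):
        self._messages = []

    def publish(self, topic: str, payload: dict, ttl: float = 30.0) -> None:
        self._messages.append(Message(topic, payload, ttl))

    def poll(self, topic: str) -> list:
        # Unlike a switch queue, a read does not consume the message;
        # every subscriber sees it until its TTL runs out.
        self._messages = [m for m in self._messages if not m.expired()]
        return [m for m in self._messages if m.topic == topic]


# Usage: something publishes an event; interested components poll for it.
queue = PubSubQueue()
queue.publish("app.instance.launched", {"ip": "10.0.0.7", "port": 8080}, ttl=60)
for event in queue.poll("app.instance.launched"):
    print("component saw event:", event.payload)
```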

Centralized Control Model

OpenFlow-based SDN, by contrast, is active. That is, it not only serves as a centralized repository for the state of applications and/or the network, but it actively distributes messages to components based on events. For example, a controller might receive a message from another system indicating the launch of a new application instance. That event triggers a series of actions on the controller that includes informing the affected network components of configuration changes. In a passive, PubSub model, the network components themselves might be polling for such an event and, upon receiving one, would initiate the appropriate configuration changes themselves.
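For contrast, here is a minimal sketch of the active push model (the controller and component names are hypothetical, and this is not an OpenFlow API): the controller reacts to an event by pushing the configuration change out to every affected component itself.

```python
# Illustrative push-based controller (hypothetical names, not OpenFlow APIs).
# The controller reacts to an event by telling each affected component
# exactly what to change; the components simply execute the instruction.

class NetworkComponent:
    def __init__(self, name: str):
        self.name = name
        self.config = {}

    def apply_config(self, change: dict) -> None:
        # In a push model the component does what it is told.
        self.config.update(change)
        print(f"{self.name}: applied {change}")


class Controller:
    def __init__(self, components):
        self.components = components

    def on_event(self, event: dict) -> None:
        # e.g. another system reports the launch of a new application instance
        if event["type"] == "app.instance.launched":
            change = {"forward_to": (event["ip"], event["port"])}
            for component in self.components:
                component.apply_config(change)   # controller dictates the change


controller = Controller([NetworkComponent("switch-1"), NetworkComponent("lb-1")])
controller.on_event({"type": "app.instance.launched", "ip": "10.0.0.7", "port": 8080})
```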

We can simplify the description of the differences even more: a controller-based architecture uses a push model, while a pubsub-based architecture uses a pull model. What this doesn't illustrate well is that in a push model, the centralized controller must know how to communicate the desired changes to each and every component it is controlling. That's one of the reasons the original models standardized on OpenFlow and were tightly focused on L2-4 stateless networking: it could be easily standardized down to a common forwarding table. While different components might internalize that differently, the basic information was always the same: IP addresses, ports and actions.
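To illustrate how compact that common denominator is, here is a rough sketch of an L2-4 rule reduced to match fields plus an action (the field names are illustrative, not the actual OpenFlow match structure):

```python
# A stateless L2-4 forwarding rule boils down to match fields plus an action.
# Field names here are illustrative, not the real OpenFlow match structure.

from dataclasses import dataclass


@dataclass(frozen=True)
class FlowRule:
    src_ip: str
    dst_ip: str
    dst_port: int
    action: str            # e.g. "forward:port2" or "drop"


forwarding_table = [
    FlowRule("10.0.0.0/24", "10.0.1.10", 443, "forward:port2"),
    FlowRule("0.0.0.0/0",   "10.0.1.10", 23,  "drop"),
]

# Vendors may store these rules differently internally, but the information
# a controller has to communicate is always the same: addresses, ports, action.
for rule in forwarding_table:
    print(rule)
```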

As we move up the stack into L4-7 stateful networking, however, this model becomes more burdensome because of the complexity of rule sets and differences in policy models across such a broad set of networking domains. Hence the plug-in support in controllers like OpenDaylight for "other" control protocols. But the basic premise of the model remains the same, regardless of the control protocol: the centralized controller dictates the changes to all components. It pushes those changes to the network. Both control and execution are centralized. The controller tells components to change their configuration.

PubSub centralizes control but decentralizes execution. The control plane is still centralized; there is one authoritative system responsible for disseminating change across the network, but each individual component (or domain controller, but we'll get to that in a minute) is responsible for executing the appropriate changes based on its own configured policies and services. A pubsub controller never tells a component "change this now"; that's up to the individual components (or domain controller).
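Here is a minimal sketch of that decentralized execution (all names hypothetical, with a plain list standing in for the shared queue): two components poll the same event, but each translates it into its own change according to local policy.

```python
# Decentralized execution in a pubsub control model (hypothetical names).
# The queue only announces that something changed; each subscribed component
# decides how to act on it according to its own local policy.

from typing import Callable

# Trivial stand-in for the shared queue: published events waiting to be polled.
event_queue = [
    {"topic": "app.instance.launched", "ip": "10.0.0.7", "port": 8080},
]


class PolicyDrivenComponent:
    def __init__(self, name: str, policy: Callable[[dict], dict]):
        self.name = name
        self.policy = policy          # local logic, not dictated centrally

    def reconcile(self, queue: list) -> None:
        # Pull model: poll for events and translate them into local changes.
        for event in queue:
            if event["topic"] == "app.instance.launched":
                change = self.policy(event)
                print(f"{self.name}: executing local change {change}")


# A load balancer and a firewall see the same event but act on it differently.
lb = PolicyDrivenComponent("lb-1", lambda e: {"add_pool_member": (e["ip"], e["port"])})
fw = PolicyDrivenComponent("fw-1", lambda e: {"allow": ("tcp", e["port"])})
lb.reconcile(event_queue)
fw.reconcile(event_queue)
```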

The Integrated Control Model

To make things even more confusing (and disruptive), these models may be used simultaneously. A software-defined architecture might be based on a centralized control model with domain controllers for specific networking functions (like security and application delivery) integrated via a pubsub-based model.

[Figure: integrated control model]

This is where we start seeing models that combine emerging technologies like OpenStack and SDN architectures. OpenStack manages at the data center level, and at its heart is a pubsub model that can be used by domain controllers (stateless L2-4 SDN, stateful L4-7 SDN, etc.) to receive notification of changes in the network and subsequently push those changes, using the appropriate control protocols, to the components each one is managing.
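A rough sketch of how such an integrated model might be wired up, with hypothetical names (this is not the actual OpenStack notification API): each domain controller pulls events from the shared, data-center-level queue and then actively pushes the resulting changes to the components it manages.

```python
# Integrated model sketch (hypothetical names; not the OpenStack API).
# A domain controller subscribes to a data-center-level event feed (pull),
# then pushes the resulting changes to its own components using whatever
# control protocol it speaks (push).

class DomainController:
    def __init__(self, name: str, devices):
        self.name = name
        self.devices = devices

    def push_change(self, device: str, change: dict) -> None:
        # A real controller would speak OpenFlow, NETCONF, a vendor API, etc.
        print(f"{self.name} -> {device}: push {change}")

    def on_poll(self, events) -> None:
        # Pull from the shared queue, then push to the managed components.
        for event in events:
            if event["topic"] == "app.instance.launched":
                change = {"route": (event["ip"], event["port"])}
                for device in self.devices:
                    self.push_change(device, change)


datacenter_events = [{"topic": "app.instance.launched", "ip": "10.0.0.7", "port": 8080}]
l2_4_controller = DomainController("sdn-l2-4", ["switch-1", "switch-2"])
l4_7_controller = DomainController("adc-l4-7", ["lb-1"])
for controller in (l2_4_controller, l4_7_controller):
    controller.on_poll(datacenter_events)
```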

Needless to say, the term "disruptive" is really inadequate to describe the level of change in the network required to support either model (or both). Both require significant changes not just to the network itself but to the way in which the network is fundamentally provisioned and managed. It's not just a new CLI or management console; these models dramatically change the design and management of networks.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
