What ‘Software-Defined’ Really Means | @CloudExpo #AI #SDN #SDX #DevOps

In a recent Cortex, I bemoaned the fact that as buzzwords go, Digital Transformation is excessively vague. There is yet another buzzword of our times that is suffering the same fate: Software-Defined.

Rare though buzz-adjectives may be among the pantheon of buzz-nouns and the occasional buzz-verb, Software-Defined (SD) has become remarkably pervasive. In fact, it ties together many quite disparate concepts into what has become a vague mishmash.

It's time to bring some clarity into the big picture of SD - what it is, and perhaps even more importantly, what it is not.

The Many Uses of Software-Defined
The most concrete use of the SD adjective is perhaps in the phrase Software-Defined Networking (SDN). SDN separates network equipment's control plane (where routing instructions and other metadata go) from the data plane (where the data being routed go), and then shifts the entire control plane to centralized software.
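
To make that separation concrete, here is a minimal sketch - plain Python with invented names, not any vendor's actual controller API - of the SDN idea: forwarding intent lives as declarative data in a centralized controller, and the switches merely carry it out.

```python
# Hypothetical flow rules: routing intent expressed as data in the control
# plane, rather than as configuration typed into each individual switch.
FLOW_RULES = [
    {"switch": "edge-01", "match": {"dst": "10.1.2.0/24"}, "action": {"out_port": 3}},
    {"switch": "edge-01", "match": {"dst": "0.0.0.0/0"},   "action": {"out_port": 1}},
]

def push_rules(rules):
    """Control plane: distribute rules to the data plane (stubbed here as a print)."""
    for rule in rules:
        print(f"install on {rule['switch']}: {rule['match']} -> {rule['action']}")

push_rules(FLOW_RULES)  # no one logs into a switch; the software defines the network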

The network, however, is only the beginning. We have SD infrastructure (SDI), SD data centers (SDDCs), SD wide-area networking (SD-WAN), and more. Each of these approaches follows the lead of SDN, shifting control of various pieces of hardware (or virtualized hardware) to centralized, software-based management and configuration applications.

SDI (which includes SDN), in fact, is at the core of cloud computing. Clearly, there would be no way to scale a cloud data center if people had to run from server to server making changes.

Furthermore, Network Functions Virtualization (NFV) from the telco world also falls under the SD banner. With NFV, telco service providers shift all control to software, so that the underlying hardware is entirely generic. No more dedicated switches, routers, and specialized telco gear - all the hardware consists of generic, white-label boxes.

Software-Defined: Beyond the Network
While the network-centric context of SD in corporate networks, cloud data centers, and telco infrastructure forms the home base of the SD movement, SDI is also an essential enabler of continuous integration and continuous delivery (CI/CD), core elements of DevOps.

In order to achieve the velocity that CI/CD promise, the ops part of the story must be SD. Instead of ops people managing servers individually, the DevOps team must be able to deploy and manage software automatically via centralized software control. In other words, the immutable infrastructure principle behind DevOps is nothing more than SDI.
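
A toy sketch of that immutable principle, with invented names rather than any particular tool's API: servers are never patched in place; a change means baking a new image from the model and launching fresh instances.

```python
def build_image(model_version: str) -> str:
    """Bake a fresh, fully configured machine image from the model (stub)."""
    return f"app-image-{model_version}"

def deploy(image: str, count: int) -> list[str]:
    """Launch brand-new instances from that image; old ones are retired, never edited."""
    return [f"{image}-instance-{i}" for i in range(count)]

fleet = deploy(build_image("v42"), count=3)
print(fleet)  # to change anything, bump the model version and redeploy the fleet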

In fact, now that virtualization has matured, all the infrastructure from hypervisors down to bare metal is SD.

At the application level, however, the SD story gets more complicated.

Using software to automate the tasks involved in deploying software is nothing new. Developers have been using runbooks for years - scripts that tell various parts of the environment to execute a series of tasks in a particular sequence.
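
A toy runbook might look like the following: an ordinary imperative script that marches through a fixed sequence of steps. The commands and hostnames here are purely illustrative.

```python
import subprocess

RUNBOOK = [
    ["git", "pull", "--ff-only"],                 # 1. fetch the latest code
    ["./build.sh"],                               # 2. build the artifact
    ["scp", "app.tar.gz", "web01:/opt/app/"],     # 3. copy it to the server
    ["ssh", "web01", "systemctl restart app"],    # 4. restart the service
]

for step in RUNBOOK:
    print("running:", " ".join(step))
    subprocess.run(step, check=True)              # abort the sequence if a step fails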

As DevOps has matured, the notion of the mundane runbook has taken on new life, as DevOps vendors automate increasingly broad swaths of the software development lifecycle (SDLC) with 'recipes' or other scripting approaches.

As applications and the environments they run in get more complicated, however, the world of DevOps automation finds itself in a Catch-22: the automation scripts or recipes themselves become increasingly complex software applications in their own right, and thus must go through an SDLC of their own, with all the testing and governance that go along with it.

As a result, we're back to square one, manually creating, managing, deploying, and versioning software.

Does Software-Defined Mean Declarative?
To address this Catch-22, some DevOps tools take a declarative approach. Instead of scripting the environment step by step, the declarative approach lets the user describe the desired behavior; the tool then interprets that description and takes the necessary actions to implement it, out of sight of the user.
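
Contrast that with the runbook sketch above: in a declarative sketch (again with invented names, not any real tool's schema), the user states only the desired end state, and a reconciler works out which actions to take.

```python
desired = {"web": {"replicas": 3, "image": "app:2.1"}}   # what the user declares
actual  = {"web": {"replicas": 1, "image": "app:2.0"}}   # what is currently running

def reconcile(desired, actual):
    """Derive (and here merely print) the actions needed to reach the desired state."""
    for name, want in desired.items():
        have = actual.get(name, {})
        if have.get("image") != want["image"]:
            print(f"{name}: roll out image {want['image']}")
        diff = want["replicas"] - have.get("replicas", 0)
        if diff:
            print(f"{name}: {'add' if diff > 0 else 'remove'} {abs(diff)} replica(s)")

reconcile(desired, actual)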

In fact, in many contexts, when most vendors say SD, they really mean that they take a declarative approach, separating configuration from the underlying implementation. There's more to SD behavior than simply following a declarative approach, however.

For example, HTML (like markup languages in general) is declarative. And while we could certainly hand-code a web page by pecking out HTML, we're far more likely to use a visual tool for that purpose.

When we build a web site using such a tool, we're essentially working with models. The model is a visual, configurable representation of the page that the tool can convert into HTML for browsers to render into the page itself for users to view.

In this example, therefore, we have three different ways of thinking about the page: as a visual model, independent of any particular technology implementation of the page; as the HTML markup for that page; and as the action of the browser itself, an application purpose-built to render HTML into visual pages.
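
A minimal sketch of those three representations, using an invented model structure: a data-structure model on the tool side, the HTML generated from it, and, implicitly, the browser that renders that markup.

```python
page_model = {
    "title": "Hello",
    "elements": [
        {"type": "heading", "text": "Welcome"},
        {"type": "paragraph", "text": "This page was generated from a model."},
    ],
}

def to_html(model):
    """Produce the platform-specific representation: HTML markup for the browser."""
    tags = {"heading": "h1", "paragraph": "p"}
    body = "".join(f"<{tags[el['type']]}>{el['text']}</{tags[el['type']]}>"
                   for el in model["elements"])
    return f"<html><head><title>{model['title']}</title></head><body>{body}</body></html>"

print(to_html(page_model))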

Architects and other shrewd readers will recognize the pattern above as being an instance of Model-Driven Architecture (MDA), or its common implementation, Model-Driven Development (MDD).

Does Software-Defined Mean Model-Driven?
MDA is an Object Management Group (OMG) standard for creating metamodels that represent platform-independent models (our visual model, above) and platform-specific models (the HTML markup in the example), as well as an abstracted approach for turning the former into the latter.

Models, especially visual ones, are in broad use today, but MDA and MDD's best days are behind them. The reason: they didn't deal as well with change as MDA's creators had hoped.

In the MDD world, a developer might build a (platform-independent) model of an application in a model-driven tool and then push a button and out would pop the (platform-specific) source code that represented the working application.

However, if developers wanted to subsequently make a change, they would either need to change the model and regenerate and redeploy all the code (an onerous and time-consuming task), or tweak the auto-generated code itself, thus making it inconsistent with the model.

Round-trip tooling that would take tweaked code and automatically update the model - the holy grail of MDD - has proven impractical.

If we combine some of the principles from MDD with the declarative approach, however, we finally see some light at the end of the tunnel. Instead of the code-generating context of MDA reminiscent of CASE tools of yore, the platform-specific representation for a declarative model consists of a metadata representation of a configuration.

In practice, tools that take this approach create such metadata representations in JSON, XML, or a domain-specific language appropriate to the task at hand. Developers occasionally have reason to view such metadata, but rarely if ever have call to monkey with it directly.
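
For instance, a declarative tool might persist the model a user builds visually as JSON metadata like the following (the schema here is invented for illustration). The platform deploys from this metadata, and changes are made back in the model rather than by editing the JSON by hand.

```python
import json

model = {
    "workflow": "order-approval",
    "steps": [
        {"name": "validate", "on_failure": "reject"},
        {"name": "manager-approval", "timeout_hours": 24},
    ],
}

print(json.dumps(model, indent=2))  # the declarative metadata the platform consumes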

Instead, users - who need not be developers - simply make changes in the model, typically via direct interaction with icons or other visual elements, or by selecting appropriate configurations. The underlying platform takes care of the rest.

The Intellyx Take
The round-trip code-generation vision of MDD proved unworkable, but the visual model to declarative metadata representation to immutable deployment vision is in essence what SD is all about.

The secret to making this approach practical, and thus the key to understanding why SD approaches have become so prevalent, is the word immutable.

Once we get an SD approach right, we no longer have to touch the deployed technology whatsoever. Instead, to make a change, update the model and redeploy.

The most important takeaway from this Cortex: this core SD pattern is fully generalizable. It works with networks, data centers, DevOps-based deployments, and as I'll cover in part two, it's also at the core of the Low-Code/No-Code movement.

It's no wonder, therefore, that Software-Defined Everything (SDX) is rising to the top of the buzzword heap - but SDX is no mere buzzword. It describes the central technological principles behind Agile Digital Transformation.

Copyright © Intellyx LLC. Intellyx publishes the Agile Digital Transformation Roadmap poster, advises companies on their digital transformation initiatives, and helps vendors communicate their agility stories. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers. Image credit: Tim Adams.

