SDN Journal: Blog Feed Post

Virtual Apostasy

When all you have is a hypervisor, everything looks like it should be virtualized


Yes, I'm about to say something that's on the order of heresy in the church of virtualization. But it has to be said, and I'm willing to say it because, well, as General Patton said, "If everyone is thinking the same... someone isn't thinking."

The original NFV white paper, cited in the excellent overview of the SDN and NFV relationship "NFV and SDN: What’s the Difference?", describes essentially two problems it attempts to solve: rapid provisioning and operational costs.

The reason commodity hardware is always associated with NFV and with SDN is that, even if there existed a rainbows-and-unicorns industry-wide standard for managing network hardware, there would still be significant time required to acquire and deploy said hardware. One does not generally have extra firewalls, routers, switches, and application network service hardware lying around idle. One might, however, have commodity (cheap) compute available on which such services could be deployed.

Software, as we've seen, has readily adapted to distribution and deployment in a digital form factor. It wasn't always so, after all. We started with floppies, moved to CD-ROM, then DVD and, finally, to neat little packages served up by application stores and centralized repositories (RPM, NPM, etc.).

Virtualization arrived just as we were moving from physical to digital methods of distribution, and it afforded us the commonality (abstraction) necessary to use commodity hardware for systems that might not otherwise be deployable on that hardware due to a lack of support by the operating system or the application itself. With the exposure of APIs and management via centralized platforms, the issue of provisioning speed was quickly addressed. Thus, virtualization is the easy answer to data center problems up and down the network stack.

But it isn't the only answer, and as SDN has shown there are other models that provide the same agility and cost benefits as virtualization without the potential downsides (performance being the most obvious with respect to the network).

ABSTRACT the ABSTRACTION

Let's abstract the abstraction for a moment. What is it virtualization offers that a similar, software-defined solution would not? If you're going to use raw compute, what is it that virtualization provides that makes it so appealing?

Hardware agnosticism comes to mind as the significant characteristic that leads everyone to choose virtualization as nearly a deus ex machina solution. The idea that one can start with bare metal (raw compute) and within minutes have any number of very different systems up and running is compelling. Because hardware-specific drivers and configuration are required at the OS level, however, that vision isn't easily realized. Enter virtualization, which provides a consistent, targetable layer for the operating system and applications.

Sure, it's software, but is standardizing on a hypervisor platform all that different from standardizing on a hardware platform?

We've turned the hypervisor into our common platform. It is what we target, what we've used as the "base" for deployment. It has eliminated the need to be concerned about five hundred or a thousand different board-level components requiring support and provided us a simple base platform upon which to deploy. But it hasn't eliminated dependencies; you can't deploy a VM packaged for VMware on a KVM system or vice-versa. There's still some virtual diaspora in the market that requires differently targeted packages. But at least we're down to half a dozen from the hundreds of possible combinations at the hardware level.

But is it really virtualization that enables this magical deployment paradigm, or is it the ability to deploy on common hardware that's important? I'd say it's the latter. It's the ability to deploy on commodity hardware that makes virtualization appealing. The hardware, however, still must exist. It must be racked and ready, available for that deployment. In terms of compute, we still have the traditional roadblocks around ensuring capacity availability. The value of virtualization, up the operational process stack as it were, suddenly becomes more about readiness: the ability to rapidly provision X or Y or Z because it's pre-packaged for the virtualization platform. In other words, it's the readiness factor that's key to rapid deployment. If there is sufficient compute (hardware) available, and if the application/service/whatever is pre-packaged for the target virtualization platform, then rapid deployment ensues.
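The readiness condition above can be sketched as a toy check. The function and platform names here are illustrative assumptions, not any real orchestration API:

```python
# Toy model of the "readiness factor": rapid deployment happens only when
# both conditions hold -- spare capacity exists AND the package targets the
# hypervisor actually running on the host. Names are hypothetical.

def can_rapid_deploy(free_cores: int, required_cores: int,
                     package_platform: str, host_platform: str) -> bool:
    """Deployment is 'rapid' only with capacity and a matching package."""
    has_capacity = free_cores >= required_cores
    is_prepackaged = package_platform == host_platform
    return has_capacity and is_prepackaged

# A VMware-packaged appliance on a KVM host stalls even with spare compute:
print(can_rapid_deploy(32, 4, "vmware", "kvm"))   # False
print(can_rapid_deploy(32, 4, "kvm", "kvm"))      # True
```

The point of the sketch: neither condition alone is enough, which is why pre-packaging and capacity planning both matter.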

Otherwise, you're sitting in the same place you were before virtualization.

So there's significant planning that goes into being able to take advantage of virtualization's commoditization of compute to enable rapid deployment. And if we abstract what it is that enables virtualization to be the goodness that it is we find that it's about pre-packaging and a very finite targeted platform upon which services and applications can be deployed.

The question is, is that the only way to enable that capability?

Obviously I don't think so or I wouldn't be writing this post.

COMPLACENCY is the GREAT INHIBITOR of INNOVATION

What if we could remove the layer of virtualization, replacing it instead with a more robust and agile operating system capable of managing a bare metal deployment with the same (or even more) alacrity than a comparable virtualized system?

It seems that eliminating yet another layer of abstraction between the network function and, well, the network would be a good thing. Network functions at layers 2-3 are I/O bound; they're heavily reliant on fast input and output, and that includes traversing the hardware up through the OS up through the hypervisor up through the... The more paths (and thus internal bus and lane traversals) a packet must travel in the system, the higher the latency. Eliminating as many of these paths as possible is one of the keys*** to pushing performance on commodity hardware toward that of purpose-built network hardware.
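A back-of-the-envelope sketch of that argument, with invented placeholder numbers (none of these figures come from any measurement; the point is only that each layer a packet traverses adds its own cost):

```python
# Per-packet path cost as a simple sum of per-layer traversal latencies.
# All values are hypothetical microseconds chosen for illustration.

def path_latency_us(layers: dict) -> float:
    """Total one-way latency: sum of each layer's traversal cost."""
    return sum(layers.values())

bare_metal = {"nic": 5.0, "os_stack": 10.0, "service": 2.0}
# The virtualized path adds a vswitch and hypervisor hop on top:
virtualized = {**bare_metal, "vswitch": 8.0, "hypervisor": 12.0}

print(path_latency_us(bare_metal))    # 17.0
print(path_latency_us(virtualized))   # 37.0
```

Removing a layer removes its term from the sum, which is the whole case for running I/O-bound network functions closer to the metal.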

If one had such a system that met the requirements - pre-packaged, rapid provisioning, able to run on commodity hardware - would you really need the virtual layer?

No.

But when all you have is a hypervisor...

I'm not saying virtualization isn't good technology, or that it doesn't make sense, or that it shouldn't be used. What I am saying is that perhaps we've become too quick to reach for the hammer when confronted with the challenge of rapid provisioning or flexibility. Let's not get complacent. We're far too early in the SDN and NFV game for that.

* Notice I did not say Sisyphean. It's doable, so it's on the order of Herculean. Unfortunately that also implies it's a long, arduous journey.

** That may be a tad hyperbolic, admittedly.

*** The operating system has a lot - a lot - to do with this equation, but that's a treatise for another day.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
