


Cloud Wars – How Many 800lb Gorillas Can Fit in the Room?

In their latest Magic Quadrant report on IaaS, Gartner describes the market as still evolving and maturing

There is a common phrase, often described as a Chinese proverb or curse: "May you live in interesting times." For those following the cloud technology space, we are definitely living in interesting times. In their latest Magic Quadrant report on IaaS, Gartner describes the market as still evolving and maturing. This would imply that market leadership is in flux, yet according to Gartner, "AWS is the overwhelming market share leader, with more than five times the compute capacity in use than the aggregate total of the other fourteen providers in this Magic Quadrant." I would say that makes AWS an 800lb gorilla in the room that the competition must get past. Many of those competitors could be considered 800lb gorillas in their own right - Microsoft, IBM, and Google are not small companies - yet AWS has managed to create a dominant position in the marketplace.

Can AWS maintain that dominance? Who among the other 'gorillas' could potentially knock AWS from its perch?

The Big Public Cage Match, IBM vs Amazon
As one would expect, the original 800lb gorilla, IBM, wants to be the one to knock AWS from that perch. Early last year, Amazon beat out IBM for a lucrative four-year, $600M cloud contract with the CIA. IBM immediately filed a bid protest in February, which was partially upheld by the GAO in June (A Big Win for Big Blue). The battle between the two behemoths continued through the summer, as IBM worked to strengthen its IaaS credentials with the acquisition of SoftLayer. In October, what had appeared to be a successful bid protest by IBM was overturned by the US Court of Federal Claims, and IBM withdrew its injunctive action (IBM Steps Back from CIA Deal). The battle continued in November when IBM started a significant ad campaign claiming it held a larger cloud business than Amazon - a blitz that included running ads on buses in Las Vegas during Amazon's premier re:Invent conference. In January of this year, IBM committed to a $1.2B investment to expand its global cloud footprint.

The battle has created very diverse views in the industry as to who will finally win. Rob Enderle wrote a very compelling piece on why IBM will win the war with Amazon Web Services. He points out that in its over-100-year history, IBM has battled many other disruptive competitors - a fact I am well aware of, being a former employee of Digital Equipment Corporation. Digital (DEC) rose in the '60s by disrupting the mainframe industry with a then-novel concept, the minicomputer, and eventually became the number-two computer manufacturer in the world (behind IBM). DEC is now a fond memory, having been acquired by Compaq (a PC manufacturer), which was in turn acquired by Hewlett-Packard. IBM putting you in its sights is not to be taken lightly.

On the flip side, a very good counterargument to that viewpoint was written by David Linthicum in his article Amazon Web Services has no reason to worry about IBM. One of the key points David makes is that IBM will have difficulty adjusting to selling the cloud service model. He points out that "the more cloud services that IBM sells, the less money it will make." In essence, IBM's own "public cloud offering" will displace sales of its existing hardware and software. Add to this that IBM doesn't always win: when Oracle first came on the scene, it disrupted the database world, and IBM came out guns blazing. Oracle has not gone anywhere.

What About the Other Gorillas?
With all the coverage the IBM/AWS cage match has gotten this year, it's sometimes easy to forget there are other significant players in this marketplace, and they are not sitting back and waiting for the results of the IBM/AWS battle. Gartner analyst Lydia Leong, in Where are the challengers to AWS?, states: "I think there's a critical shift happening in the market right now. Three very dangerous competitors are just now entering the market - Microsoft, Google and VMware. I think the real war for market share is just beginning." Forrester echoes a similar viewpoint. Viewing the market through a lens of the services provided (compute, RDBMS, storage), Forrester analyst Jeffrey Hammond sees Microsoft and Google making strong inroads in the RDBMS and storage services space. Like IBM, Microsoft and Google have deep pockets to compete in this space and are not going to give up without a fight. Google, for example, just announced a partner program with three tiers of third-party vendors providing technical and consulting services for its cloud platform. What I find telling is that neither of these analysts even mentioned the IBM vs. AWS fight, which has been getting the majority of the public attention.

Verizon Joins the Battle
Last October Verizon announced a new cloud offering, built from the ground up to compete with AWS and the other IaaS vendors. This offering is distinct from the existing Verizon/Terremark cloud offering and is based on technology from CloudSwitch, a company Verizon acquired a little over two years ago. Verizon hopes to differentiate the offering by allowing clients to define specific performance capabilities around compute, I/O, memory and storage. Verizon states its technology allows it to avoid the 'noisy neighbor' problem seen with other vendors - a not-so-subtle swipe at AWS.
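To make the 'noisy neighbor' idea concrete, here is a toy Python sketch contrasting a best-effort host, where tenants split capacity proportionally, with a performance-reserved host of the kind Verizon is describing. Everything here is illustrative - the tenant names, IOPS figures, and allocation policies are my own assumptions, not any vendor's actual design:

```python
# Toy model of the "noisy neighbor" problem on shared infrastructure.
# All names and numbers are illustrative, not any vendor's actual design.

def shared_allocation(host_iops, demands):
    """Best-effort host: when oversubscribed, IOPS are split
    proportionally to each tenant's demand."""
    total = sum(demands.values())
    if total <= host_iops:
        return dict(demands)  # everyone gets what they asked for
    scale = host_iops / total
    return {tenant: d * scale for tenant, d in demands.items()}

def reserved_allocation(host_iops, reservations, demands):
    """Performance-reserved host: each tenant is guaranteed up to,
    and capped at, its reservation."""
    assert sum(reservations.values()) <= host_iops  # no oversubscription
    return {t: min(demands[t], reservations[t]) for t in demands}

# Quiet period: both tenants fit comfortably on a 10,000-IOPS host.
quiet = {"app": 3000, "neighbor": 3000}
# Noisy period: the neighbor bursts far beyond its fair share.
noisy = {"app": 3000, "neighbor": 27000}

best_effort_quiet = shared_allocation(10_000, quiet)
best_effort_noisy = shared_allocation(10_000, noisy)
reserved_noisy = reserved_allocation(10_000, {"app": 4000, "neighbor": 6000}, noisy)
```

On the best-effort host, the well-behaved tenant's throughput collapses the moment its neighbor bursts, even though its own demand never changed; with reservations, the burst is contained and the well-behaved tenant is unaffected. That containment is exactly the property Verizon is claiming as a differentiator.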

In January Verizon announced a partnership with Oracle for their cloud environment. "Beginning in the first quarter of 2014, Oracle customers will be able to license Oracle Database 11g and 12C, Oracle Fusion Middleware and Oracle Enterprise Manager to run in Verizon's Managed Hosting and Enterprise Cloud virtual infrastructures, according to a Verizon document that details the Oracle partnership." While AWS does provide the ability to use Oracle 11g in their environment, the addition of the middleware components could be a key differentiator for Verizon.

This by itself is not a game changer, but it is a sign that Verizon is serious about competing in this space, with its sights set squarely on AWS. In its announcement Verizon noted that it will continue to expand the partnerships and ecosystem around its cloud offerings, and it quickly demonstrated that the Oracle deal was just the first salvo: Verizon announced this week that it is extending its partnerships with CloudBees and Cloud Foundry, including a monetary investment in CloudBees through its venture arm.

Net Neutrality, Could It Be a Game Changer?
In 2010 the FCC adopted its Preserving the Open Internet, Broadband Industry Practices order as a means to enforce the concept of net neutrality on the Internet. This was in response to a variety of broadband providers (including Comcast and Verizon) throttling bandwidth based on the traffic's source - and some of those sources could be considered competitors. The FCC net neutrality rule had three key components:

  • Transparency - providers must disclose network management and performance information
  • No Blocking - providers may not block lawful content and services
  • No Unreasonable Discrimination - providers may not unreasonably discriminate in transmitting lawful traffic

Now a federal appeals court has struck down that rule, in a suit brought by Verizon. This immediately raised a question in my mind: does this mean Verizon, in theory, could throttle access to public cloud providers (such as AWS) while providing better access to its own public cloud services? Imagine the impact that could have on the marketplace. Verizon could effectively lock the other gorillas out of the room and build a room of its own.
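The mechanism behind that fear is simple. Here is a minimal sketch of source-based throttling - the very practice the no-unreasonable-discrimination rule was meant to prevent. The domain names, bandwidth numbers, and policy are entirely hypothetical:

```python
# Hypothetical ISP policy: flows from the carrier's own services run at
# full speed; flows from everyone else are capped. Purely illustrative.

PREFERRED_SOURCES = {"carrier-cloud.example"}  # hypothetical in-house cloud
THROTTLED_CAP_MBPS = 10                        # hypothetical cap for rivals

def effective_bandwidth(source, requested_mbps):
    """Bandwidth the hypothetical ISP actually grants a flow."""
    if source in PREFERRED_SOURCES:
        return requested_mbps                       # no cap for own services
    return min(requested_mbps, THROTTLED_CAP_MBPS)  # competitors get capped

own = effective_bandwidth("carrier-cloud.example", 100)
rival = effective_bandwidth("rival-cloud.example", 100)
```

A few lines of policy are all it takes to turn identical 100Mbps requests into a tenfold performance gap depending on who owns the source, which is why the court's decision matters so much to the cloud providers on the other end of the pipe.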

Will this happen? Not overnight, that's for sure. The FCC has already said it will appeal the decision. Additionally, the court gave the FCC some wiggle room to modify the rule in a way that would pass muster with the courts. The battle over net neutrality is far from over, but how it is finally resolved could have a long-term impact on us and on the public cloud providers.

Who Will Still Be Standing in Two Years?
One thing is clear: the market is still in flux. Gartner predicts that one in four cloud providers will be gone by 2015, and IBM has already demonstrated the consolidation trend with its purchase of SoftLayer. Concern about whether a cloud provider will still be around could become a self-fulfilling prophecy for the smaller vendors; the failure of cloud storage provider Nirvanix last year has put that concern at the forefront for many buyers. If the small vendors are acquired or forced out of business over the next two years, we could easily end up with a room full of just the 800lb gorillas as the market shakes out. The question will then become: how many of these 800lb gorillas can fit in the room?

More Stories By Ed Featherston

Ed Featherston is a senior enterprise architect and director at Collaborative Consulting. He brings more than 34 years of information technology experience designing, building, and implementing large, complex solutions, with significant expertise in systems integration, Internet/intranet, client/server, middleware, and cloud technologies. Ed has designed and delivered projects for a variety of industries, including financial services, pharmacy, government and retail.
