Henry Ford & Bezos’s Law Signal It's Time to Ditch the Datacenter [#Cloud]

Total Cost of Infrastructure Ownership (TCIO) dramatically favors Cloud

Editor's note:  An abridged version of this post ran last week on the Gigaom blog.

With an ear to the ground and an eye on the sky, Gigaom's Barb Darrow chronicles the competitive factors shaping the bumpy journey that is cloud computing among the superpowers (AWS in fight of its life as customers like Dropbox ponder hybrid clouds and Google pricing). Wherever you stand on the debate over which cloud giant will reign supreme, it's clear the economic forces shaping the market are evolving quickly.

Now comes new cloud computing data based on Total Cost of Infrastructure Ownership (TCIO) showing that cloud providers are innovating and reducing costs in areas beyond hardware. The result is a more compelling case for the cloud as a far cheaper platform than a build-your-own datacenter. Further, the economic advantage favoring the cloud provider platform will widen over time.

In many ways, cloud computing is bringing to the enterprise world what Henry Ford did for cars. Ford developed and designed a method for manufacturing that steadily reduced the cost of manufacturing the Model T, thus lowering the price of his car. The result was a decline in the number of US auto manufacturers from more than 200 in the 1920s to just eight in 1940.  This astounding 96% reduction in manufacturers over 20 years foreshadows what could happen to enterprises running their own data centers in the not too distant future.

If you're still with me, here's why.

Previously, I posited that the future of cloud computing is the availability of more computing power at a much lower cost. I call this Bezos's law: "Over the history of cloud computing, the price of a unit of computing power is reduced by approximately 50 percent every three years."

Bezos's law measures the cost of a given unit of cloud computing over time, analogous to Moore's law, which observes that the number of transistors on an integrated circuit doubles approximately every two years.

Bezos's law is a measure of the rate of change of Total Cost of Infrastructure Ownership (TCIO), while Moore's law tracks only CPU transistor density, and the CPU is a small fraction of the cost of a datacenter or cloud.
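To make the halving rate concrete, here is a minimal sketch of what Bezos's law implies for unit cost over time. The starting cost of $1.00 per "unit of compute" is a normalization I chose for illustration; the only figure taken from the text is the 50-percent-every-three-years halving rate.

```python
def bezos_law_cost(initial_cost: float, years: float,
                   halving_period: float = 3.0) -> float:
    """Unit cost after `years`, halving every `halving_period` years."""
    return initial_cost * 0.5 ** (years / halving_period)

# Project a normalized $1.00 unit of compute forward.
for y in (0, 3, 6, 9, 12):
    print(f"year {y:2d}: ${bezos_law_cost(1.00, y):.4f}")
# year  0: $1.0000
# year  3: $0.5000
# year  6: $0.2500
# year  9: $0.1250
# year 12: $0.0625
```

After twelve years a unit of compute costs about 6 percent of its original price, which is the compounding effect the rest of this argument leans on.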

Why is TCIO so relevant?

The IBM SoftLayer team commissioned McKinsey to conduct a study of TCIO. The comprehensive analysis slide (below) highlights the following shares of total cost:

  • 30% for Labor
  • 35% for Hardware
  • 55% for Hardware and Facilities

[Slide: McKinsey TCIO cost-breakdown analysis]
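The quoted figures can be combined to back out the implied breakdown. This is my own arithmetic on the three numbers above, not something stated on the slide; in particular the "other" residual (software, power, networking, and so on) is an inference.

```python
# Back out the implied TCIO breakdown from the figures quoted above.
labor = 0.30
hardware = 0.35
hardware_and_facilities = 0.55

facilities = hardware_and_facilities - hardware   # implied: 20%
other = 1.0 - (labor + hardware + facilities)     # residual: 15%

print(f"labor={labor:.0%} hardware={hardware:.0%} "
      f"facilities={facilities:.0%} other={other:.0%}")
# labor=30% hardware=35% facilities=20% other=15%
```

The point for the argument: labor, facilities, and the residual together are roughly two-thirds of TCIO, so a provider that only rode Moore's law on hardware would leave most of the cost curve untouched.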

When considering the rate of Bezos's law in light of IBM's analysis, it is clear that cloud providers are innovating and reducing costs in areas beyond hardware.

Several clear drivers suggest the compounding trend described in Bezos's law will continue for many decades:

  • Scale: Amazon, Google, IBM and Microsoft add huge amounts of capacity every day, enough to run most Fortune 1000 companies.
  • Innovation: The cloud market is competitive with innovative approaches and services being brought to market quickly.
  • Competition & Price Transparency: While the base IaaS service varies among providers, the offerings are close enough for customers to compare them easily.

Let's assume the Fortune 5000 each operate, on average, seven datacenters, for a total of 35,000.

Bezos's law will drive (think Henry Ford and the Model T) a similarly titanic shift from datacenter to cloud, resulting in a roughly 90% reduction (about 31,500 facilities) in enterprise-owned and -operated datacenters by 2030.
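The arithmetic behind that projection is simple enough to spell out. Both inputs are the assumptions stated above (about 5,000 large enterprises with roughly seven datacenters each, and a 90% retirement rate by 2030), not independently sourced figures.

```python
# Rough arithmetic behind the 2030 datacenter projection.
enterprises = 5_000        # assumed "Fortune 5000" population
datacenters_each = 7       # assumed average per enterprise
total = enterprises * datacenters_each   # 35,000

retired = int(total * 0.90)              # 31,500 closed or repurposed
remaining = total - retired              # 3,500 still enterprise-run

print(total, retired, remaining)
# 35000 31500 3500
```

Note that a 90% reduction of 35,000 is 31,500, slightly more than the round 30,000 sometimes quoted for this scenario.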

This squares with Gartner's prognostication about the future size of the cloud market (Gartner: Public cloud services to hit $131B by 2017). There are likely to be new businesses dedicated to repurposing datacenters as retirement homes or newfangled dance clubs.

Just as people first thought automobiles were toys, early critics said the cloud would only be for limited use: test/dev environments and spiky workloads. Now the consensus is that the cloud can host almost all applications. Early cars were also expensive and unreliable, but Ford's relentless cost reduction put the whole country on wheels, and the same compelling reduction in TCIO is now at work in the cloud. It may be the end of the road for the datacenter, but the economic forces shaping the cloud signal the beginning of a better idea for the enterprise.

More Stories By Greg O'Connor

Greg O'Connor is President & CEO of AppZero. Pioneering the Virtual Application Appliance approach to simplifying application-lifecycle management, he is responsible for translating Appzero's vision into strategic business objectives and financial results.

O'Connor has over 25 years of management and technical experience in the computer industry. He was founder and president of Sonic Software, acquired in 2005 by Progress Software (PRGS). There he grew the company from concept to over $40 million in revenue.

At Sonic, he evangelized and created the Enterprise Service Bus (ESB) product category, which is generally accepted today as the foundation for Service Oriented Architecture (SOA). Follow him on Twitter @gregoryjoconnor.
