Why Big Data Applications Adoption is Accelerating

IDC's methodology for sizing the Big Data technology and services market includes evaluation of current and expected deployments

Big Data applications have gained new momentum in the marketplace, as the benefits of working with larger and larger data sets enable analysts to spot key business-related trends. International Data Corporation (IDC) released a worldwide forecast of Big Data opportunities, noting that the market is expected to grow from $3.2 billion in 2010 to $16.9 billion in 2015.

This represents a compound annual growth rate (CAGR) of 40 percent -- or about 7 times that of the overall Information and Communications Technology (ICT) market.
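For reference, that headline figure follows directly from the forecast endpoints. A quick check in Python (a minimal sketch, using only the two market-size values quoted above):

    # Compound annual growth rate implied by IDC's 2010 and 2015 forecast values.
    start, end, years = 3.2, 16.9, 5  # market size in billions of US dollars

    cagr = (end / start) ** (1 / years) - 1
    print(f"CAGR: {cagr:.1%}")  # CAGR: 39.5% -- i.e., roughly 40 percent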



"The Big Data market is expanding rapidly as large IT companies and start-ups vie for customers and market share," said Dan Vesset, program vice president, Business Analytics Solutions at IDC.

IDC believes that for business technology buyers, opportunities exist to use Big Data solutions to improve operational efficiency and to drive innovation. Use cases are already present across industries and geographic regions.

"There are also Big Data opportunities for both large IT vendors and start ups," Vesset continued. "Major IT vendors are offering both database solutions and configurations supporting Big Data by evolving their own products as well as by acquisition. At the same time, more than half a billion dollars in venture capital has been invested in new Big Data technology." Findings from the latest IDC market study include:

  • While the five-year CAGR for the worldwide market is expected to be nearly 40 percent, growth varies considerably by segment: 27.3 percent for servers, 34.2 percent for software, and 61.4 percent for storage (see the sketch after this list).
  • The growth in appliances, cloud services, and outsourcing deals for Big Data technology will likely mean that, over time, end users will pay progressively less attention to technology capabilities and focus instead on business value. System performance, availability, security, and manageability will all still matter greatly; however, how they are achieved will be less of a differentiator among vendors.
  • Today there is a shortage of trained Big Data technology experts, in addition to a shortage of analytics experts. This labor supply constraint will act as an inhibitor of adoption and use of Big Data technologies, and it will also encourage vendors to deliver Big Data technologies as cloud-based solutions.
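To make those segment rates concrete, compounding each one over the five-year forecast window shows how differently the segments scale (a minimal sketch in Python, using only the CAGR figures quoted above):

    # Five-year growth multiple implied by each segment's forecast CAGR.
    segments = {"servers": 0.273, "software": 0.342, "storage": 0.614}

    for name, cagr in segments.items():
        multiple = (1 + cagr) ** 5
        print(f"{name}: grows {multiple:.1f}x over five years")

    # servers: grows 3.3x, software: grows 4.4x, storage: grows 11.0x

Storage's roughly 11x multiple is what makes it the standout segment highlighted below.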


"While software and services make up the bulk of the market opportunity through 2015, infrastructure technology for Big Data deployments is expected to grow slightly faster at 44 percent CAGR. Storage, in particular, shows the strongest growth opportunity, growing at 61.4 percent CAGR through 2015," said Benjamin S. Woo, program vice president, Storage Systems at IDC.

This significant revenue growth is underscored by the large number of new open-source projects that are driving infrastructure investments.

Focus on Big Data Deployment Methodology

IDC's methodology for sizing the Big Data technology and services market includes evaluation of current and expected deployments that fit one of three scenarios:

  1. Deployments where the data collected is over 100 terabytes (TB). IDC is using data collected, not stored, to account for the use of in-memory technology where data may not be stored on a disk.
  2. Deployments of ultra-high-speed messaging technology for real-time, streaming data capture and monitoring. This scenario represents Big Data in motion as opposed to Big Data at rest.
  3. Deployments where the data sets may not be very large today, but are growing very rapidly at a rate of 60 percent or more annually.

Additionally, in each of these three scenarios, IDC requires that the technology be deployed on scale-out infrastructure, and that the deployment include either two or more data types or data sources, or a high-speed data source such as click-stream tracking or the monitoring of machine-generated data.
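Taken together, these criteria amount to a simple qualification test. As a rough illustration only (not IDC's actual sizing model; the field names here are hypothetical), the rules can be expressed as a small Python predicate:

    from dataclasses import dataclass

    @dataclass
    class Deployment:
        data_collected_tb: float   # data collected, in terabytes (not data stored)
        annual_growth_rate: float  # e.g., 0.60 for 60 percent per year
        realtime_streaming: bool   # ultra-high-speed messaging in use
        scale_out: bool            # runs on scale-out infrastructure
        data_sources: int          # count of distinct data types or sources
        high_speed_source: bool    # e.g., click-stream or machine-generated data

    def qualifies_as_big_data(d: Deployment) -> bool:
        # At least one of the three IDC scenarios must hold...
        scenario = (
            d.data_collected_tb > 100        # 1. over 100 TB collected
            or d.realtime_streaming          # 2. Big Data "in motion"
            or d.annual_growth_rate >= 0.60  # 3. rapid data growth
        )
        # ...plus the common infrastructure and data-source requirements.
        varied_sources = d.data_sources >= 2 or d.high_speed_source
        return scenario and d.scale_out and varied_sources

    # Example: a modest 40 TB deployment growing 75 percent per year on
    # scale-out infrastructure with three data sources qualifies via scenario 3.
    print(qualifies_as_big_data(Deployment(40, 0.75, False, True, 3, False)))  # True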

Read the original blog entry...

More Stories By David H Deans

David H. Deans is the Managing Director at the GeoActive Group. He has more than 25 years of experience in the Technology, Media and Telecom sectors.
