Big Data Journal: Blog Feed Post

Why Big Data Applications Adoption is Accelerating

IDC methodology for sizing the Big Data technology and services market includes evaluation of current and expected deployments

Big Data applications have gained new momentum in the marketplace, as the benefits of working with ever-larger data sets enable analysts to spot key business trends. International Data Corporation (IDC) has released a worldwide forecast of Big Data opportunities, noting that the market is expected to grow from $3.2 billion in 2010 to $16.9 billion in 2015.

This represents a compound annual growth rate (CAGR) of 40 percent -- or about 7 times that of the overall Information and Communications Technology (ICT) market.
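As a quick sanity check on the figures above (the $3.2 billion and $16.9 billion endpoints and the five-year span are from the IDC forecast; the calculation itself is just the standard CAGR formula):

```python
# Verify the implied compound annual growth rate (CAGR) from the
# IDC figures quoted above: $3.2B in 2010 to $16.9B in 2015.
start, end, years = 3.2, 16.9, 5  # revenue in $B, 2010 -> 2015

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 39.5%, i.e. "nearly 40 percent"
```

The result lands just under 40 percent, matching the article's rounded figure.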



"The Big Data market is expanding rapidly as large IT companies and start-ups vie for customers and market share," said Dan Vesset, program vice president, Business Analytics Solutions at IDC.

IDC believes that for business technology buyers, opportunities exist to use Big Data solutions to improve operational efficiency and to drive innovation. Use cases are already present across industries and geographic regions.

"There are also Big Data opportunities for both large IT vendors and start-ups," Vesset continued. "Major IT vendors are offering database solutions and configurations supporting Big Data, both by evolving their own products and by acquisition. At the same time, more than half a billion dollars in venture capital has been invested in new Big Data technology."

Findings from the latest IDC market study include:

  • While the five-year CAGR for the worldwide market is expected to be nearly 40 percent, growth varies by segment: 27.3 percent for servers, 34.2 percent for software, and 61.4 percent for storage.
  • The growth in appliances, cloud services, and outsourcing deals for Big Data technology will likely mean that, over time, end users will pay less attention to technology capabilities and focus instead on business value. System performance, availability, security, and manageability will still matter greatly, but how they are achieved will be less of a point of differentiation among vendors.
  • Today there is a shortage of trained Big Data technology experts, in addition to a shortage of analytics experts. This labor supply constraint will act as an inhibitor of adoption and use of Big Data technologies, and it will also encourage vendors to deliver Big Data technologies as cloud-based solutions.


"While software and services make up the bulk of the market opportunity through 2015, infrastructure technology for Big Data deployments is expected to grow slightly faster at 44 percent CAGR. Storage, in particular, shows the strongest growth opportunity, growing at 61.4 percent CAGR through 2015," said Benjamin S. Woo, program vice president, Storage Systems at IDC.

The significant growth rate in revenue is underscored by the large number of new open source projects that drive infrastructure investments.

Focus on Big Data Deployment Methodology

IDC methodology for sizing the Big Data technology and services market includes evaluation of current and expected deployments that follow one of the following three scenarios:

  1. Deployments where the data collected is over 100 terabytes (TB). IDC is using data collected, not stored, to account for the use of in-memory technology where data may not be stored on a disk.
  2. Deployments of ultra-high-speed messaging technology for real-time, streaming data capture and monitoring. This scenario represents Big Data in motion as opposed to Big Data at rest.
  3. Deployments where the data sets may not be very large today, but are growing at a rate of 60 percent or more annually.

Additionally, IDC requires that in each of these three scenarios the technology be deployed on scale-out infrastructure, and that deployments include either two or more data types or data sources, or a high-speed data source such as click-stream tracking or the monitoring of machine-generated data.
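The sizing criteria above can be read as a simple predicate: a deployment counts toward the market if it matches at least one of the three scenarios and also meets the infrastructure and data-source requirements. A minimal sketch (the field and function names here are illustrative, not IDC's own terminology):

```python
# Illustrative encoding of the IDC sizing methodology described above.
# A deployment qualifies if it matches one of the three scenarios AND
# runs on scale-out infrastructure with multiple or high-speed sources.
from dataclasses import dataclass

@dataclass
class Deployment:
    data_collected_tb: float   # terabytes collected (not stored)
    streaming_capture: bool    # ultra-high-speed messaging in use
    annual_growth_rate: float  # e.g. 0.60 for 60% per year
    scale_out: bool            # deployed on scale-out infrastructure
    data_source_count: int     # distinct data types/sources
    high_speed_source: bool    # e.g. click-stream or machine data

def qualifies_as_big_data(d: Deployment) -> bool:
    scenario = (
        d.data_collected_tb > 100        # 1. >100 TB of data collected
        or d.streaming_capture           # 2. Big Data "in motion"
        or d.annual_growth_rate >= 0.60  # 3. rapid annual growth
    )
    infrastructure = d.scale_out and (
        d.data_source_count >= 2 or d.high_speed_source
    )
    return scenario and infrastructure
```

Note that the infrastructure requirement applies to all three scenarios, so a 150 TB deployment on a single scale-up server would still fall outside the market as IDC defines it.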


More Stories By David H Deans

David H. Deans is the Managing Director at the GeoActive Group. He has more than 25 years of experience in the Technology, Media and Telecom sectors.
