The 'Big' Fallacy of Big Data | @BigDataExpo #BigData

Why companies are luring you into the Big Data Trap

Unless you've been living under a rock for the past couple of years, you've been hearing about the world of Big Data nonstop. Big Data promises fortune and power to those who can wield the somewhat mystical and often nebulous power of "Big Data". Unfortunately for the rest of us mere mortals, Big Data is built on an outright lie that is both pernicious and unfortunate. It's hiding right there in plain sight in the name itself: the word BIG.

The Fallacy of Big Data is that you have to have a lot of data for it to be relevant. The common catchphrase is: "More data = more insights". There is a nugget of truth to this: in some cases, a lot of data is needed to establish valid patterns and create real insight into the activity the data represents. More often than not, however, it creates a significant challenge for those responsible for performing analytics: sifting through a mountain of data to find the parts that actually matter. Recent studies have shown that fully 80% of data analysis time is spent just tinkering with the data to get it into a usable format. So more data creates a massive data curation issue and leaves us with more work to do before we can even start experimenting, much less monetizing our data.
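To make that curation burden concrete, here is a minimal sketch of the kind of cleanup that eats most of that 80% before any analysis happens. The file name, column names, and formats are hypothetical; the point is simply that the "usable format" work comes first.

```python
import pandas as pd

# Hypothetical raw export: before any "insight" happens, most of the
# effort goes into just making the data usable.
raw = pd.read_csv("transactions_raw.csv")

# Normalize column names that arrive in inconsistent formats
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]

# Parse dates and amounts that were stored as free-form text
raw["date"] = pd.to_datetime(raw["date"], errors="coerce")
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")

# Drop duplicates and rows too broken to use
clean = raw.drop_duplicates().dropna(subset=["date", "amount"])

# Only now can the actual analysis begin
print(clean.describe())
```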

The reality of "Big Data" is that it was invented by those with no skin in the game. Analytics, open source, digital transformation, and Cloud are all of the technologies that enable comprehensive data analysis. With minimal infrastructure, commodity hardware, and free or nearly free software to store, analyze, and more importantly drive value from that data, the big infrastructure players are left out in the cold with nothing to offer. Enter "Big Data", because if you are going to try and manage petabytes of data you need good storage, and 10's of thousands of servers is awful to manage. So the Fallacy is born:

"In order to get real results from data, you cannot rely on just a little bit of it, or just the relevant data, you need every set of data imaginable. Therefore, (and here's where things get squidgy) you need to bring all that data in house (because the cloud is too expensive to store it) and you need a lot of manageable and flexible enterprise-grade gear to do it with (because free stuff is not enterprise ready)."

You can see how this is built around some nuggets of truth. I was asked recently, "How would you move a petabyte of data to Amazon cloud storage?" and I answered as truthfully as I could: "Very slowly." Cloud does get expensive when used for a lot of infrastructure, but used as part of the overall solution it is an important tool. Likewise, the thought of managing a massive Hadoop cluster of 1,000 "exactly the same" servers sounds like the hell of IT in the pre-VM days, but it is also not an accurate picture of the Hadoop landscape. The vast majority of analytics clusters top out around 50 servers, and that is far more manageable (and less expensive) than huge enterprise gear. To be fair, there are organizations where a massive-scale, enterprise-platform approach will make sense, but the unfortunate side effect of this approach by legacy vendors is that they have made the solution itself the barrier to entry.
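For a back-of-the-envelope sense of "very slowly," here is the arithmetic for a hypothetical dedicated 1 Gbps link running at full utilization; real-world throughput would be lower still.

```python
# Rough transfer-time estimate for 1 PB over a dedicated 1 Gbps link
petabyte_bits = 1e15 * 8           # 1 PB expressed in bits
link_bps = 1e9                     # 1 Gbps, assuming full utilization
seconds = petabyte_bits / link_bps
print(f"{seconds / 86400:.0f} days")   # roughly 93 days, ignoring overhead
```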

The problem is that now "Big Data" has made it into the vernacular and worse yet, has become synonymous with Data Analytics. Every company, organization, or even individual on earth can benefit from analyzing their relevant data for new insights. Take a very simple example; look at your budget to identify where you overspend (too many meals out for example). That is personal analytics, it does not require complex anything, and there are numerous ways to do it with free or nearly free tools. Now scale that up to the bank that wants to offer new digital, data-driven products to customers. They already have a lot of that data in house, and they already have a lot of analytical tools. Why would they need, per-se, to include every data set under the sun? They may want some more sets of data (social media to identify trends that might lead to investment opportunity), but they don't HAVE to have it stored in house to use it - it is all offered free-to-use via serialized API's. In the unique case where if they did decide to store it all in house, we are not talking about 10's of PB of data. More like adding a few 10's to 100's of TB for the data in question, because again - you don't download all of Twitter, just the stuff that is relevant to you. Also analytic data is largely transient data, meaning that it is used for the analysis and then discarded (especially true in the real-time world), so where is the need for massive infrastructure to support that initiative?

I have spoken a lot about "Big Data" and the Fallacy and trap of paying too much attention to the word BIG. Data is important to everyone and it can have value for anyone. In my most recent speaking sessions I have shown how you can do a simple social analysis for free in a matter of minutes. You don't need a massive infrastructure to make that production ready either. It just takes some willingness to see through the noise to the actual value of what the "Big Data" message is trying to say. Analytics is important and valuable for everyone. You don't have to be a Fortune 100 company to create value from the data you already have, and to bring in new data for analytics. Everyone can do it.

For more thought-provoking content on Big Data and Data Analytics, click here.

Connect with me on Twitter or LinkedIn and share your thoughts!

More Stories By Christopher Harrold

As an Agent of IT Transformation, I have over 20 years of experience in the field. I started off as the IT Ops guy and followed the trends of the DevOps movement wherever I went. I want to shake up accepted ways of thinking and develop new models and designs that push the boundaries of technology and of the accepted status quo. There is no greater reward for me than seeing something that was once dismissed as "impossible" become the new normal, and I have been richly rewarded throughout my career with this result. In my last role as CTO at EMC Corporation, I worked tirelessly with a small group of engineers and product managers to build a market-leading, innovative platform for data analytics, combining best-of-breed storage, analytics, and visualization solutions to enable the Data as a Service model for enterprise and mid-sized companies globally.
