Big Data Redefined By @TonyShan | @CloudExpo [#BigData]

Big Data is a loose term for the collection, storage, processing, and sophisticated analysis of massive amounts of data, far larger and drawn from many more kinds of sources than ever before. The definition of Big Data can be traced back to the 3Vs model defined by Doug Laney in 2001: Volume, Velocity, and Variety. A fourth V was later added in various forms, such as “Value” or “Veracity”.

Interestingly, the conceptualization of Big Data from the beginning of this century seems to be gaining wider use only now, nearly 14 years later. That is a little strange, given how dynamic the world has been and how much has changed in the meantime. Does the old definition still fit?

A recent report revealed that more than 80% of the executives surveyed thought the term Big Data was overstated, confusing, or misleading. They liked the concept but hated the phrase. As Tom Davenport pointed out, nobody likes the term, and almost everybody wishes for a better, more descriptive name for it.

The big problem with Big Data is that the V-model describes the phenomenon poorly and is outdated for the new paradigm. Even the original author admitted that he was simply writing about the burgeoning data in the data warehousing and business intelligence world. It is necessary to redefine the term.

Big Data in today’s world is essentially the ability to parse more information, faster and deeper, to provide unprecedented insights into the business world. In the current situation, the concept is better described by 4Rs than 4Vs: Real-time, Relevance, Revelation, and Refinery.

  • Real-time: With the maturing and commoditization of distributed file systems and parallel processing, real-time analysis is realistic. Instant response is a must for most online applications, and fast analysis is now expected for data of any size. Batch mode is becoming history, retained mainly for cost or due-diligence reasons. Anything less than (near) real-time brings a significant competitive disadvantage.

  • Relevance: Data analysis must be context-aware, semantic, and meaningful. Simple string matching or syntactic equality is no longer enough, and unrelated data is worse than useless because it distracts. Data analytics must be knowledge-based, with only the relevant information analyzed. Interdisciplinary science and engineering must be leveraged to quantify how relevant the data is to a user’s areas of interest. Simply put, what matters most is not how much data is delivered or how fast, but how applicable and useful the content is to an end user’s needs at the right time and in the right place.

  • Revelation: Previously unknown things are uncovered and disclosed as knowledge not realized before. Hidden patterns are identified to correlate data elements and events at massive scale. Ambiguous, vague, and obscure data sets can be crystallized to provide better views and statistics. Seemingly random data can be mined to signal potential linkages and interlocks. User behaviors are analyzed via machine learning to find and understand collaborative influence and sentiment.

  • Refinery: Raw data are extracted and transformed into relevant, actionable information effectively and on demand. The refined data is timely, clean, aggregated, insightful, and well understood. The data refinery takes the uncertainty out of the data and filters and reshapes it for meaningful analysis and operations. The refinement output can be multi-structured to unlock potential value and deepen understanding, and data may be re-refined in a self-improving process based on downstream needs and consumption context. A minimal sketch of the relevance and refinery ideas follows this list.
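To make the Relevance and Refinery points concrete, here is a minimal sketch in Python of a toy data refinery: raw records are cleaned, scored against a user's interest profile with a simple cosine similarity over word counts (a stand-in for richer semantic analysis), and aggregated on demand. The record format, the helper names, and the 0.2 threshold are all invented for illustration and are not from the original post.

```python
# Hypothetical illustration of the "Relevance" and "Refinery" ideas:
# extract usable records, score each one against an interest profile,
# keep only the relevant ones, and aggregate the result.
from collections import Counter
from math import sqrt


def tokenize(text: str) -> Counter:
    """Lower-cased bag-of-words representation of a piece of text."""
    return Counter(text.lower().split())


def relevance(doc: str, interests: str) -> float:
    """Cosine similarity between a document and an interest profile.
    Unlike exact string matching, partial vocabulary overlap still
    yields a non-zero score."""
    a, b = tokenize(doc), tokenize(interests)
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def refine(raw_records, interests, threshold=0.2):
    """Refinery step: drop malformed records, keep only relevant ones,
    and aggregate the rest into a clean summary."""
    clean = (r for r in raw_records if r.get("text") and r.get("source"))
    scored = ((r, relevance(r["text"], interests)) for r in clean)
    kept = [(r, s) for r, s in scored if s >= threshold]
    by_source = Counter(r["source"] for r, _ in kept)
    return {"kept": len(kept), "by_source": dict(by_source)}


if __name__ == "__main__":
    raw = [
        {"source": "web", "text": "Real-time analytics on streaming sensor data"},
        {"source": "crm", "text": "Quarterly travel expense reimbursement form"},
        {"source": "web", "text": "Hadoop cluster sizing for large scale analytics"},
        {"source": "log", "text": None},  # malformed record, filtered out
    ]
    print(refine(raw, "big data analytics at scale"))
```

Running the example keeps the two records that overlap the interest profile even though neither contains the exact phrase, which is the difference between semantic relevance and simple string matching.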

It is obvious that Big Data can be better characterized by 4Rs in the new era. For more information, please contact Tony Shan ([email protected]). ©Tony Shan. All rights reserved.

Slides: Tony Shan ‘Thinking in Big Data’

An effective way of thinking in Big Data is a methodical framework for dealing with the predicted 50-60% shortage of qualified Big Data resources in the U.S.

This holistic model comprises the scientific and engineering steps that are involved in accelerating Big Data solutions: problem, diagnosis, facts, analysis, hypothesis, solution, prototype and implementation.

In his session at Big Data Expo®, Tony Shan focused on the concept, importance, and considerations for each of these eight components.

He drilled down to the key techniques and methods commonly used in these steps, such as root cause examination, process mapping, force field investigation, benchmarking, interviews, brainstorming, focus groups, Pareto charts, SWOT, impact evaluation, gap analysis, POC, and cost-benefit studies.
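As a rough, hypothetical illustration of how the eight steps and the techniques above might fit together, the framework can be treated as an ordered checklist with candidate techniques attached to each step. The assignment of techniques to steps below is an assumption made for this sketch, not taken from the session itself.

```python
# Hypothetical sketch: the eight-step framework as an ordered checklist.
# The technique-to-step mapping is illustrative only.
STEPS = [
    ("problem",        ["interview", "brainstorming", "focus group"]),
    ("diagnosis",      ["root cause examination", "process mapping"]),
    ("facts",          ["benchmarking", "Pareto chart"]),
    ("analysis",       ["force field investigation", "gap analysis", "SWOT"]),
    ("hypothesis",     ["impact evaluation"]),
    ("solution",       ["cost-benefit study"]),
    ("prototype",      ["POC"]),
    ("implementation", ["lessons-learned review"]),
]


def walk_framework(notes_per_step):
    """Walk the steps in order, showing each step's suggested techniques
    and whether any notes have been captured for it yet."""
    for name, techniques in STEPS:
        status = "done" if name in notes_per_step else "pending"
        print(f"{name:>14} [{status}] techniques: {', '.join(techniques)}")


if __name__ == "__main__":
    walk_framework({"problem": "ad-hoc reporting too slow", "diagnosis": "no parallelism"})
```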

Best practices and lessons learned from real-world Big Data projects were also discussed.

More Stories By Tony Shan

Tony Shan works as a senior consultant and advisor at a global applications and infrastructure solutions firm, helping clients realize the greatest value from their IT. Shan is a renowned thought leader and technology visionary with years of field experience and guru-level expertise in cloud computing, Big Data, Hadoop, NoSQL, social, mobile, SOA, BI, technology strategy, IT roadmapping, systems design, architecture engineering, portfolio rationalization, product development, asset management, strategic planning, process standardization, and Web 2.0. He has directed the lifecycle R&D and buildout of large-scale, award-winning distributed systems on diverse platforms at Fortune 100 companies and public-sector organizations, including IBM, Bank of America, Wells Fargo, Cisco, Honeywell, and Abbott.

Shan is an inventive expert with a proven track record of influential innovations such as Cloud Engineering. He has authored dozens of top-notch technical papers on next-generation technologies and more than ten books that have won multiple awards. He is a frequent keynote speaker; serves as chair, panelist, advisor, judge, and organizing committee member at prominent conferences and workshops; is an editor and editorial advisory board member for IT research journals and books; and is a founder of several user groups, forums, and centers of excellence (CoE).
