Big Data Redefined By @TonyShan | @CloudExpo [#BigData]

Big Data is a loose term for the collection, storage, processing, and sophisticated analysis of massive amounts of data

Big Data is a loose term for the collection, storage, processing, and sophisticated analysis of massive amounts of data, far larger and from many more kinds of sources than ever before. The definition of Big Data can be traced back to the 3Vs model defined by Doug Laney in 2001: Volume, Velocity, and Variety. The fourth V was later added in different fashions, such as “Value” or “Veracity”.

Interestingly, this conceptualization of Big Data from the beginning of the century has only gained wide currency now, nearly 14 years later. That is a little strange, given how much today's dynamic world has evolved and how many things have changed in the meantime. Does the old definition still fit?

A recent report revealed that more than 80% of the executives surveyed considered the term Big Data overstated, confusing, or misleading. They liked the concept but hated the phrase. As Tom Davenport pointed out, nobody likes the term, and almost everybody wishes for a better, more descriptive name.

The big problem with Big Data is that the V-model describes the phenomenon poorly and has not kept pace with the new paradigm. Even the original author admitted that he was simply writing about the burgeoning data in the data warehousing and business intelligence world. It is time to redefine the term.

Big Data in today’s world is essentially the ability to parse more information, faster and deeper, to provide unprecedented insights into the business world. In the current situation the concept is better captured by 4Rs than 4Vs: Real-time, Relevance, Revelation, and Refinery.

  • Real-time: With the maturing and commoditization of distributed file systems and parallel processing frameworks, real-time analysis is now realistic. Instant response is a must for most online applications, and fast analysis is expected for data of any size. Batch mode is fading into history, retained only where cost constraints or due-diligence requirements demand it. Anything slower than (near) real-time creates a significant competitive disadvantage.

  • Relevance: Data analysis must be context-aware, semantic, and meaningful. Simple string matching or syntactic equality is no longer enough, and unrelated data is worse than useless: it is a distraction. Analytics must be knowledge-based, so that only relevant information is analyzed. Interdisciplinary science and engineering must be leveraged to quantify how relevant the data is to a user’s interest areas. Simply put, what matters most is not how much data is delivered or how fast, but how applicable and useful the content is to an end user’s needs at the right time and in the right place.

  • Revelation: Previously unknown things are uncovered and expressed as knowledge not realized before. Hidden patterns are identified that correlate data elements and events at massive scale. Ambiguous, vague, and obscure data sets can be crystallized into clearer views and statistics. Seemingly random data can be mined to reveal potential linkages and interlocks. User behavior is analyzed via machine learning to find and understand collaborative influence and sentiment.

  • Refinery: Raw data is extracted and transformed into relevant, actionable information on demand. The refined data is timely, clean, aggregated, insightful, and well understood. A data refinery takes the uncertainty out of the data and filters and reshapes it for meaningful analysis and operations. The refined output can be multi-structured to unlock potential value and deepen understanding. Data may be re-refined in a self-improving process driven by downstream needs and consumption context.
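To make the Relevance and Refinery ideas concrete, here is a minimal toy sketch of a "data refinery" pass: score each record's relevance to a user's interest areas (going beyond exact string match), drop dirty and irrelevant rows, and aggregate what remains. All names, fields, and thresholds are my own illustration, not from the article.

```python
# Toy data-refinery sketch: relevance scoring + filtering + aggregation.
# Hypothetical record schema: {"category": str, "value": number, "tags": [str]}
from collections import defaultdict

def relevance(record, interest):
    # Relevance: instead of exact string matching, score the overlap
    # between a record's tags and the user's interest areas (0.0-1.0).
    tags = set(record.get("tags", []))
    return len(tags & interest) / len(interest) if interest else 0.0

def refine(raw_records, interest, threshold=0.5):
    # Refinery: drop dirty rows (missing values) and irrelevant rows,
    # then aggregate the rest into a cleaner, more actionable summary.
    totals = defaultdict(float)
    for rec in raw_records:
        if rec.get("value") is None:              # dirty row: skip
            continue
        if relevance(rec, interest) < threshold:  # irrelevant row: skip
            continue
        totals[rec["category"]] += float(rec["value"])
    return dict(totals)

raw = [
    {"category": "sales", "value": 10,   "tags": ["retail", "q3"]},
    {"category": "sales", "value": None, "tags": ["retail"]},      # dirty
    {"category": "ops",   "value": 5,    "tags": ["hr"]},          # irrelevant
    {"category": "sales", "value": 7,    "tags": ["retail", "q3"]},
]
print(refine(raw, interest={"retail", "q3"}))  # {'sales': 17.0}
```

A real refinery would of course run continuously (Real-time) and feed its output back into the next refinement cycle, but the filter-then-aggregate shape is the core idea.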

It is obvious that Big Data in the new era is better characterized by the 4Rs. For more information, please contact Tony Shan ([email protected]). ©Tony Shan. All rights reserved.

Slides: Tony Shan ‘Thinking in Big Data’


An effective way of thinking in Big Data is a methodical framework for dealing with the predicted 50-60% shortage of qualified Big Data professionals in the U.S.

This holistic model comprises the scientific and engineering steps that are involved in accelerating Big Data solutions: problem, diagnosis, facts, analysis, hypothesis, solution, prototype and implementation.
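The eight steps read naturally as a linear pipeline, each stage building on the findings of the one before. As a hedged sketch (the step names come from the talk; the handler structure and all other names are my own illustration):

```python
# Hypothetical sketch: the eight-step model as an ordered pipeline.
STEPS = ["problem", "diagnosis", "facts", "analysis",
         "hypothesis", "solution", "prototype", "implementation"]

def run_framework(handlers, context):
    """Run each step's handler in order, threading a shared context
    dict through so later steps can build on earlier findings."""
    for step in STEPS:
        context = handlers[step](context)
    return context

# Usage: trivial handlers that just record each step's outcome.
handlers = {step: (lambda ctx, step=step: {**ctx, step: "done"})
            for step in STEPS}
print(run_framework(handlers, {}))
```

The point of the threaded context is that, for example, the hypothesis stage can only be formed from the facts and analysis already gathered, mirroring the sequential discipline the framework prescribes.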

In his session at Big Data Expo®, Tony Shan focused on the concept, importance, and considerations for each of these eight components.

He drilled down into the key techniques and methods commonly used in these steps, such as root-cause examination, process mapping, force-field investigation, benchmarking, interviews, brainstorming, focus groups, Pareto charts, SWOT, impact evaluation, gap analysis, POC, and cost-benefit study.

Best practices and lessons learned from real-world Big Data projects were also discussed.


More Stories By Tony Shan

Tony Shan works as a senior consultant and advisor at a global applications and infrastructure solutions firm, helping clients realize the greatest value from their IT. Shan is a renowned thought leader and technology visionary with many years of field experience and guru-level expertise in cloud computing, Big Data, Hadoop, NoSQL, social, mobile, SOA, BI, technology strategy, IT roadmapping, systems design, architecture engineering, portfolio rationalization, product development, asset management, strategic planning, process standardization, and Web 2.0. He has directed the lifecycle R&D and buildout of large-scale, award-winning distributed systems on diverse platforms at Fortune 100 companies and public-sector organizations, including IBM, Bank of America, Wells Fargo, Cisco, Honeywell, and Abbott.

Shan is an inventive expert with a proven track record of influential innovations such as Cloud Engineering. He has authored dozens of top-notch technical papers on next-generation technologies and more than ten books that have won multiple awards. He is a frequent keynote speaker; a chair, panelist, advisor, judge, and organizing-committee member at prominent conferences and workshops; an editor and editorial advisory board member for IT research journals and books; and a founder of several user groups, forums, and centers of excellence (CoE).
