
Data Lake: Save Me More Money vs. Make Me More Money By @Schmarzo | @BigDataExpo #BigData

The data lake is a centralized repository for all the organization’s data of interest, whether internally or externally generated

2016 will be the year of the data lake. But I expect that much of the 2016 data lake effort will focus on activities and projects that save the company more money. That is fine from a foundation perspective, but IT and the business will both miss the bigger opportunity: leveraging the data lake (and its associated analytics) to make the company more money.

This blog examines an approach that allows organizations to quickly achieve some “save me more money” cost benefits from their data lake without losing sight of the bigger “make me more money” payoff – by coupling the data lake with data science to optimize key business processes, uncover new monetization opportunities and create a more compelling and differentiated customer experience.

Let’s start by quickly reviewing the concept of a data lake.

The Data Lake
The data lake is a centralized repository for all the organization’s data of interest, whether internally or externally generated. The data lake frees the advanced analytics and data science teams from being held captive to the data volume (detailed transactional history at the individual level), variety (structured and unstructured data) and velocity (real-time/right-time) constraints of the data warehouse. The data lake provides a line of demarcation that supports the traditional business intelligence/data warehouse environment (for operational and management reporting and dashboards) while enabling the organization’s new advanced analytics and data science capabilities (see Figure 1).


Figure 1: The Data Lake

The viability of a data lake was enabled by many factors, including:

  • The development of Hadoop as a scale-out processing environment. Hadoop was developed and perfected by internet giants such as Google, Yahoo, eBay and Facebook to store, manage and analyze petabytes of web, search and social media data.
  • The dramatic cost savings of open source software (Hadoop, MapReduce, Pig, Python, HBase, etc.) running on commodity servers, which yields a 20x to 50x cost advantage over traditional, proprietary data warehousing technologies.
  • The ability to load data as-is, which means that a schema does NOT need to be created prior to loading the data. This supports the rapid ingestion and analysis of a wide variety of structured and unstructured data sources.
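The "load data as-is" point above is what is usually called schema-on-read. A minimal Python sketch of the idea (the records and field names here are hypothetical, and a real data lake would ingest files via Hadoop-ecosystem tooling rather than an in-memory list):

```python
import json

# Raw records land in the lake as-is -- no schema is declared up front,
# and records from different sources can have entirely different shapes.
raw_records = [
    '{"user": "u1", "event": "click", "ts": 1609459200}',
    '{"user": "u2", "amount": 42.50, "currency": "USD"}',
]

# Schema-on-read: structure is imposed only at query time. Both records
# load successfully even though they share no common schema.
parsed = [json.loads(line) for line in raw_records]
clicks = [r for r in parsed if r.get("event") == "click"]

print(len(parsed))  # all records ingested despite differing schemas
print(len(clicks))  # structure applied only when the question is asked
```

The contrast with a data warehouse is that a warehouse would have rejected the second record (or forced an up-front schema redesign), while the lake defers that decision until analysis time.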

The characteristics of a data lake include:

  • Ingest. Capture data, as-is, from a wide range of traditional (operational, transactional) and new (structured and unstructured) sources
  • Store. Store all your data in one environment for cross-functional business analysis
  • Analyze. Support the analytics and data science to uncover new customer, product, and operational insights
  • Surface. Empower front-line employees and managers, and drive more profitable customer engagement, by leveraging customer, product and operational insights
  • Act. Integrate analytic insights into operational (Finance, Manufacturing, Marketing, Sales Force, Procurement, Logistics) and management (Business Intelligence reports and dashboards) systems
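The five characteristics above can be sketched as a toy end-to-end flow (pure Python with hypothetical data; in practice each stage would be backed by Hadoop-ecosystem tooling):

```python
# Ingest: capture raw records as-is from different "sources".
lake = []
lake.extend([
    {"src": "pos", "customer": "c1", "spend": 120.0},
    {"src": "web", "customer": "c1", "clicks": 14},
])
lake.append({"src": "pos", "customer": "c2", "spend": 45.0})

# Store: everything lives in one environment (here, one list),
# enabling cross-functional analysis across sources.

# Analyze: derive a cross-source insight -- total spend per customer.
spend = {}
for rec in lake:
    if "spend" in rec:
        spend[rec["customer"]] = spend.get(rec["customer"], 0.0) + rec["spend"]

# Surface: expose the insight to front-line employees and managers.
top_customer = max(spend, key=spend.get)

# Act: feed the insight into an operational decision.
offer = {"customer": top_customer, "action": "loyalty_offer"}
print(offer)
```

The point of the sketch is the sequencing: value comes not from storing the data (Ingest/Store) but from the insight-to-action loop at the end (Analyze/Surface/Act).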

Data Lake Foundation: Save Me More Money
Most companies today have some level of experience with Hadoop. And many of these companies are embracing the data lake in order to drive costs out of the organization. Some of these “save me more money” areas include:

  • Data enrichment and data transformation, for activities such as converting unstructured text fields into a structured format or creating new composite metrics such as the recency, frequency and sequencing of customer activities.
  • ETL (Extract, Transform, Load) offload from the data warehouse. It is estimated that ETL jobs consume 40% to 80% of all data warehouse cycles, so organizations can realize immediate value by moving ETL jobs off the expensive data warehouse and onto the data lake.
  • Data archiving, which provides a lower-cost way to archive or store data for historical, compliance or regulatory purposes.
  • Data discovery and data visualization, which support the ability to rapidly explore and visualize a wide variety of structured and unstructured data sources.
  • Data warehouse replacement. A growing number of organizations are leveraging open-source technologies such as Hive, HBase, HAWQ and Impala to move their business intelligence workloads off the traditional RDBMS-based data warehouse and onto the Hadoop-based data lake.
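The recency/frequency enrichment mentioned in the first bullet can be sketched as follows (a minimal illustration with hypothetical transactions; dates are simplified to integer day numbers):

```python
from collections import defaultdict

# Hypothetical transaction log: (customer_id, day_number of purchase).
transactions = [
    ("c1", 3), ("c1", 10), ("c1", 28),
    ("c2", 5),
]
today = 30

# Group purchase days by customer.
days = defaultdict(list)
for cust, day in transactions:
    days[cust].append(day)

# Derive composite metrics: recency (days since last purchase)
# and frequency (number of purchases in the window).
metrics = {
    cust: {"recency": today - max(ds), "frequency": len(ds)}
    for cust, ds in days.items()
}
print(metrics)
```

These derived metrics are exactly the kind of enrichment that is cheap to compute at scale on the data lake but expensive to run inside the data warehouse.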

These customers are working with what I call “data lake 1.0”: a technology stack of storage, compute and Hadoop. The savings from these “save me more money” activities can be meaningful, with a Return on Investment (ROI) typically in the 10% to 20% range. But organizations that stop there leave the 5x to 10x ROI projects on the table. Do I have your attention now?

Data Lake Game-changer: Make Me More Money
Leading organizations are transitioning their data lakes to what I call “data lake 2.0,” which includes the data lake 1.0 technology foundation (storage, compute, Hadoop) plus the capabilities necessary to build business-centric, analytics-enabled applications. These additional data lake 2.0 capabilities include data science, data visualization, data governance, data engineering and application development. Data lake 2.0 supports the rapid development of analytics-enabled applications, built upon the Analytics “Hub and Spoke” data lake architecture that I introduced in my blog “Why Do I Need A Data Lake?” (see Figure 2).


Figure 2: Analytics Hub and Spoke Architecture

Data lake 2.0 and the Analytics “Hub and Spoke” architecture support the development of a wide range of analytics-enabled applications, including:

  • Customer Acquisition
  • Customer Retention
  • Predictive Maintenance
  • Marketing Effectiveness
  • Customer Lifetime Value
  • Demand Forecasting
  • Network Optimization
  • Risk Reduction
  • Load Balancing
  • “Smart” Products
  • Pricing Optimization
  • Yield Optimization
  • Theft Reduction
  • Revenue Protection

Note: Some organizations (public sector, federal, military, etc.) don’t really have a “make me more money” charter; so for these organizations, the focus should be on “make me more efficient.”
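As one illustration of the application list above, a customer lifetime value score can be estimated from the kind of per-customer metrics a data lake makes available. This is a deliberately simplified formula with hypothetical inputs; real CLV models treat churn, discounting and margin in far more detail:

```python
def lifetime_value(avg_order_value, orders_per_year,
                   retention_rate, discount_rate=0.10):
    """Simplified customer lifetime value.

    Annual value times a discounted expected lifetime: for a constant
    retention rate r and discount rate d, expected discounted lifetime
    is approximated here as 1 / (1 - r + d).
    """
    annual_value = avg_order_value * orders_per_year
    expected_lifetime = 1.0 / (1.0 - retention_rate + discount_rate)
    return annual_value * expected_lifetime

# Hypothetical customer: $50 orders, 6 per year, 80% retention.
clv = lifetime_value(50.0, 6, 0.80)
print(round(clv, 2))
```

The inputs (average order value, order frequency, retention) are themselves the composite metrics the “save me more money” enrichment work produces, which is why data lake 1.0 is a prerequisite for, not an alternative to, data lake 2.0.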

Big Data Value Iceberg
The game-changing business value enabled by big data isn’t found in the technology-centric data lake 1.0, the tip of the iceberg. As with an iceberg, the bigger business opportunities are hiding just under the surface, in data lake 2.0 (see Figure 3).


Figure 3: Data Lake Value Iceberg

The “save me more money” projects are the typical domain of IT, and that is what data lake 1.0 can deliver. However, if your organization is interested in the 10x-20x ROI “make me more money” opportunities, then it needs to aggressively continue down the data lake path to get to data lake 2.0.
10x-20x ROI projects…do I have your attention now?


More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business” and “Big Data MBA: Driving Business Strategies with Data Science”, is responsible for setting strategy and defining the Big Data service offerings for Dell EMC’s Big Data Practice.

As a CTO within Dell EMC’s 2,000+ person consulting organization, he works with organizations to identify where and how to start their big data journeys. He’s written white papers, is an avid blogger and is a frequent speaker on the use of Big Data and data science to power an organization’s key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow where he teaches the “Big Data MBA” course. Bill also just completed a research paper on “Determining The Economic Value of Data”. Onalytica recently ranked Bill as #4 Big Data Influencer worldwide.

Bill has over three decades of experience in data warehousing, BI and analytics. Bill authored the Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements. Bill serves on the City of San Jose’s Technology Innovation Board, and on the faculties of The Data Warehouse Institute and Strata.

Previously, Bill was vice president of Analytics at Yahoo where he was responsible for the development of Yahoo’s Advertiser and Website analytics products, including the delivery of “actionable insights” through a holistic user experience. Before that, Bill oversaw the Analytic Applications business unit at Business Objects, including the development, marketing and sales of their industry-defining analytic applications.

Bill holds a Master of Business Administration from the University of Iowa and a Bachelor of Science degree in Mathematics, Computer Science and Business Administration from Coe College.
