
Deja VVVu: Others Claiming Gartner’s Construct for Big Data

By Bob Gourley

This article originally appeared on the Gartner Blog Network in January 2012 and is reprinted here with permission from Gartner and its author, Doug Laney.

In the late 1990s, while I was a META Group analyst (note: META is now part of Gartner), it became evident that our clients increasingly were encumbered by their data assets. While many pundits were talking about, many clients were lamenting, and many vendors were seizing the opportunity of these fast-growing data stores, I realized that something else was going on as well. Sea changes in the speed at which data was flowing, mainly due to electronic commerce, along with the increasing breadth of data sources, structures and formats due to the post-Y2K ERP application boom, were as challenging to data management teams as the increasing quantity of data, or more so.

In an attempt to help our clients get a handle on how to recognize, and more importantly deal with, these challenges, I first began speaking at industry conferences on this 3-dimensional data challenge of increasing data volume, velocity and variety. Then in late 2000 I drafted a research note, published in February 2001, entitled 3-D Data Management: Controlling Data Volume, Velocity and Variety.

Fast forward to today: the "3Vs" framework for understanding and dealing with Big Data has now become ubiquitous. In fact, other research firms, major vendors and consulting firms have even posited the 3Vs (or an unmistakable variant) as their own concept. Since the original piece is no longer available in Gartner archives but is in increasing demand, I wanted to make it available here for anyone to reference and cite:

Original Research Note PDF: 3-D Data Management: Controlling Data Volume, Velocity and Variety

Date: 6 February 2001     Author: Doug Laney

3-D Data Management: Controlling Data Volume, Velocity and Variety. Current business conditions and media are pushing traditional data management principles to their limits, giving rise to novel and more formalized approaches.

META Trend: During 2001/02, leading enterprises will increasingly use a centralized data warehouse to define a common business vocabulary that improves internal and external collaboration. Through 2003/04, data quality and integration woes will be tempered by data profiling technologies (for generating metadata, consolidated schemas, and integration logic) and information logistics agents. By 2005/06, data, document, and knowledge management will coalesce, driven by schema-agnostic indexing strategies and portal maturity.

The effects of the e-commerce surge, a rise in merger and acquisition activity, increased collaboration, and the drive to harness information as a competitive catalyst are driving enterprises to higher levels of consciousness about how data is managed at its most basic level. In 2001/02, historical, integrated databases (e.g. data warehouses, operational data stores, data marts) will be leveraged not only for their intended analytical purposes, but increasingly for intra-enterprise consistency and coordination. By 2003/04, these structures (including their associated metadata) will be on par with application portfolios, organization charts and procedure manuals for defining a business to its employees and affiliates.

Data records, data structures, and definitions commonly accepted throughout an enterprise reduce the tendency of fiefdoms to pull against one another due to differences in the way each perceives where the enterprise has been, where it stands presently, and where it is headed. Readily accessible current and historical records of transactions, affiliates (partners, employees, customers, suppliers), and business processes (or rules), along with definitional and navigational metadata (see ADS Delta 896, 21st Century Metadata: Mapping the Enterprise Genome, 7 Aug 2000), enable employees to paddle in the same direction. Conversely, application-specific data stores (e.g. accounts receivable versus order status) and geography-specific data stores (e.g. North American sales versus international sales) offer conflicting or insular views of the enterprise that, while important for feeding transactional systems, provide no "single version of the truth," giving rise to inconsistency in the way enterprise factions function.

While enterprises struggle to consolidate systems and collapse redundant databases to enable greater operational, analytical, and collaborative consistency, changing economic conditions have made this job more difficult. E-commerce, in particular, has exploded data management challenges along three dimensions: volume, velocity and variety. In 2001/02, IT organizations must compile a variety of approaches to have at their disposal for dealing with each.

Data Volume

E-commerce channels increase the depth and breadth of data available about a transaction (or any point of interaction). The lower cost of e-channels enables an enterprise to offer its goods or services to more individuals or trading partners, and up to 10x the quantity of data about an individual transaction may be collected, thereby increasing the overall volume of data to be managed. Furthermore, as enterprises come to see information as a tangible asset, they become reluctant to discard it.

Typically, increases in data volume are handled by purchasing additional online storage. However, as data volume increases, the relative value of each data point decreases proportionately, resulting in a poor financial justification for merely incrementing online storage. Viable alternatives and supplements to hanging new disk include:

  • Implementing tiered storage systems (see SIS Delta 860, 19 Apr 2000) that cost-effectively balance levels of data utility with data availability using a variety of media
  • Limiting data collected to that which will be leveraged by current or imminent business processes
  • Limiting certain analytic structures to a statistically valid sample of the full data set (a minimal sampling sketch follows this list)
  • Profiling data sources to identify and subsequently eliminate redundancies
  • Monitoring data usage to determine "cold spots" of unused data that can be eliminated or offloaded to tape (e.g. Ambeo, BEZ Systems, Teleran)
  • Outsourcing data management altogether (e.g. EDS, IBM)
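
The sampling technique above invites a concrete illustration. The following is a minimal Python sketch, not part of the original note, of reservoir sampling, one classic way to maintain a fixed-size, statistically valid uniform sample over an arbitrarily large record stream; the stream source and sample size shown are hypothetical.

```python
import random
from typing import Iterable, List, TypeVar

T = TypeVar("T")

def reservoir_sample(records: Iterable[T], k: int) -> List[T]:
    """Keep a uniform random sample of k items from a stream of
    unknown (and potentially very large) length, using O(k) memory."""
    reservoir: List[T] = []
    for i, record in enumerate(records):
        if i < k:
            reservoir.append(record)      # fill the reservoir first
        else:
            j = random.randint(0, i)      # each record kept with probability k/(i+1)
            if j < k:
                reservoir[j] = record
    return reservoir

# Hypothetical usage: keep 10,000 transactions from a million-record feed
# (range() stands in for a transaction stream here).
sample = reservoir_sample(range(1_000_000), k=10_000)
print(len(sample))  # 10000
```

The appeal for the volume problem is that the analytic structure's footprint stays fixed no matter how large the underlying stream grows.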

Data Velocity

E-commerce has also increased point-of-interaction (POI) speed and, consequently, the pace of data used to support interactions and generated by interactions. As POI performance is increasingly perceived as a competitive differentiator (e.g. Web site response, inventory availability analysis, transaction execution, order-tracking updates, product/service delivery), so too is an organization's ability to manage data velocity. Recognizing that data velocity management is much more than a physical bandwidth and protocol issue, enterprises are implementing architectural solutions such as:

  • Operational data stores (ODSs) that periodically extract, integrate and re-organize production data for operational inquiry or tactical analysis
  • Caches that provide instant access to transaction data while buffering back-end systems from additional load and performance degradation (unlike ODSs, caches are updated according to adaptive business rules and have schemas that mimic the back-end source; see the sketch after this list)
  • Point-to-point (P2P) data routing between databases and applications (e.g. D2K, DataMirror) that circumvents high-latency hub-and-spoke models more appropriate for strategic analysis
  • Designing architectures that balance data latency with application data requirements and decision cycles, without assuming the entire information supply chain must be near real time
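
To make the caching bullet concrete, here is a minimal Python sketch, again not from the original note, of a read-through cache that buffers a back-end system and refreshes entries according to a simple rule (a fixed time-to-live); the inventory-lookup function and TTL value are hypothetical stand-ins.

```python
import time
from typing import Any, Callable, Dict, Tuple

class ReadThroughCache:
    """Serve repeated reads from memory, going to the back-end source
    only on a miss or when an entry's time-to-live rule has expired."""

    def __init__(self, fetch: Callable[[str], Any], ttl_seconds: float = 30.0):
        self._fetch = fetch                  # back-end lookup (slow path)
        self._ttl = ttl_seconds              # simple refresh rule
        self._store: Dict[str, Tuple[float, Any]] = {}

    def get(self, key: str) -> Any:
        now = time.monotonic()
        hit = self._store.get(key)
        if hit is not None and now - hit[0] < self._ttl:
            return hit[1]                    # fresh entry: back end untouched
        value = self._fetch(key)             # stale or missing: go to source
        self._store[key] = (now, value)
        return value

# Hypothetical back-end call, e.g. an inventory lookup against a database.
def slow_inventory_lookup(sku: str) -> int:
    time.sleep(0.5)  # simulate back-end latency
    return 42

cache = ReadThroughCache(slow_inventory_lookup, ttl_seconds=10.0)
print(cache.get("SKU-123"))  # first call pays the back-end cost
print(cache.get("SKU-123"))  # served from cache; no additional load
```

The design point is the one the bullet makes: the cache absorbs POI read traffic so the system of record is not degraded by it.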

Data Variety

Through 2003/04, no greater barrier to effective data management will exist than the variety of incompatible data formats, non-aligned data structures, and inconsistent data semantics. By then, interchange and translation mechanisms will be built into most DBMSs. Until then, application portfolio sprawl (particularly when based on a "strategy" of autonomous software implementations due to e-commerce solution immaturity), increased partnerships, and M&A activity will intensify data variety challenges. Attempts to resolve data variety issues must be approached as an ongoing endeavor encompassing the following techniques:

  • Data profiling (e.g. Data Mentors, Metagenix) to discover hidden relationships and resolve inconsistencies across multiple data sources (see ADS898)
  • XML-based data format "universal translators" that import data into standard XML documents for export into another data format (e.g. infoShark, XML Solutions; a minimal sketch of this idea follows the list)
  • Enterprise application integration (EAI) predefined adapters (e.g. NEON, Tibco, Mercator) for acquiring and delivering data between known applications via message queues, or EAI development kits for building custom adapters
  • Data access middleware (e.g. Information Builders' EDA/SQL, SAS Access, OLE DB, ODBC) for direct connectivity between applications and databases
  • Distributed query management (DQM) software (e.g. Enth, InfoRay, Metagon) that adds a data routing and integration intelligence layer above "dumb" data access middleware
  • Metadata management solutions (i.e. repositories and schema standards) to capture and make available definitional metadata that can help provide contextual consistency to enterprise data
  • Advanced indexing techniques for relating (if not physically integrating) data of various incompatible types (e.g. multimedia, documents, structured data, business rules)
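
As a small illustration of the "universal translator" pattern above, the sketch below, invented for this reprint rather than drawn from the note or any vendor's product, imports delimited records into a generic XML document that a second, format-specific exporter could then consume; the element names and sample data are hypothetical.

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_xml(csv_text: str, root_tag: str = "records",
               row_tag: str = "record") -> str:
    """Import delimited data into a generic XML document that can serve
    as the interchange hub between otherwise incompatible formats."""
    root = ET.Element(root_tag)
    for row in csv.DictReader(io.StringIO(csv_text)):
        record = ET.SubElement(root, row_tag)
        for field, value in row.items():
            ET.SubElement(record, field).text = value  # one element per column
    return ET.tostring(root, encoding="unicode")

# Hypothetical input: two order records from a partner's CSV extract.
orders = "order_id,status\n1001,shipped\n1002,open\n"
print(csv_to_xml(orders))
# <records><record><order_id>1001</order_id><status>shipped</status></record>...
```

With N formats, translating each to and from one standard XML representation requires 2N converters rather than the N² point-to-point translators the variety problem would otherwise demand.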

As with any sufficiently fashionable technology, users should expect the ebb and flow of the data management marketplace to yield solutions that consolidate multiple techniques, as well as solutions that are increasingly application- or environment-specific (see Figure 1, Data Management Solutions). In selecting a technique or technology, enterprises should first perform an information audit assessing the status of their information supply chain to identify and prioritize particular data management issues.

Business Impact: Attention to data management, particularly in a climate of e-commerce and greater need for collaboration, can enable enterprises to achieve greater returns on their information assets.

Bottom Line: In 2001/02, IT organizations must look beyond traditional, brute-force physical approaches to data management. Through 2003/04, practices for resolving e-commerce-accelerated data volume, velocity and variety issues will become more formalized and diverse. Increasingly, these techniques will involve trade-offs and architectural solutions that involve and impact application portfolios and business strategy decisions.

###

Over the past decade, Gartner analysts including Regina Casonato, Anne Lapkin, Mark A. Beyer, Yvonne Genovese and Ted Friedman have continued to expand our research on this topic, identifying and refining other “big data” concepts. In September 2011 they published the tremendous research note Information Management in the 21st Century.  And in 2012, Mark Beyer and I developed and published Gartner’s updated definition of Big Data to reflect its value proposition and requirements for “new innovative forms of processing.” (See The Importance of ‘Big Data’: A Definition)

Doug Laney is a research vice president for Gartner Research, where he covers business analytics solutions and projects, information management, and data-governance-related issues. He is considered a pioneer in the field of data warehousing and created the first commercial project methodology for business intelligence/data warehouse projects. Mr. Laney also originated the discipline of information economics (infonomics).

Follow Doug on Twitter: @Doug_Laney


More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity and emerging technology firms. He has extensive industry experience in intelligence and security. He was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in government IT by GovFresh.
