Deja VVVu: Others Claiming Gartner’s Construct for Big Data

By Bob Gourley

This article originally appeared on the Gartner Blog Network in January 2012 and is reprinted here with permission from Gartner and its author, Doug Laney.

In the late 1990s, while I was a META Group analyst (note: META is now part of Gartner), it was becoming evident that our clients were increasingly encumbered by their data assets.  While many pundits were talking about these fast-growing data stores, many clients were lamenting them, and many vendors were seizing the opportunity they presented, I also realized that something else was going on. Sea changes in the speed at which data was flowing, mainly due to electronic commerce, along with the increasing breadth of data sources, structures and formats due to the post-Y2K ERP application boom, were as challenging to data management teams as the increasing quantity of data, if not more so.

In an attempt to help our clients get a handle on how to recognize, and more importantly, deal with these challenges, I first began speaking at industry conferences on this 3-dimensional data challenge of increasing data volume, velocity and variety.  Then in late 2000 I drafted a research note, published in February 2001, entitled 3-D Data Management: Controlling Data Volume, Velocity and Variety.

Fast forward to today:  The “3Vs” framework for understanding and dealing with Big Data has now become ubiquitous.  In fact, other research firms, major vendors and consulting firms have even posited the 3Vs (or an unmistakable variant) as their own concept.  Since the original piece is no longer available in Gartner archives but is in increasing demand, I wanted to make it available here for anyone to reference and cite:

Original Research Note PDF: 3-D Data Management: Controlling Data Volume, Velocity and Variety

Date: 6 February 2001     Author: Doug Laney

3-D Data Management: Controlling Data Volume, Velocity and Variety. Current business conditions and mediums are pushing traditional data management principles to their limits, giving rise to novel and more formalized approaches.

META Trend: During 2001/02, leading enterprises will increasingly use a centralized data warehouse to define a common business vocabulary that improves internal and external collaboration. Through 2003/04, data quality and integration woes will be tempered by data profiling technologies (for generating metadata, consolidated schemas, and integration logic) and information logistics agents. By 2005/06, data, document, and knowledge management will coalesce, driven by schema-agnostic indexing strategies and portal maturity.

The effect of the e-commerce surge, a rise in merger & acquisition activity, increased collaboration, and the drive for harnessing information as a competitive catalyst is driving enterprises to higher levels of consciousness about how data is managed at its most basic level.  In 2001/02, historical, integrated databases (e.g. data warehouses, operational data stores, data marts) will be leveraged not only for intended analytical purposes, but increasingly for intra-enterprise consistency and coordination. By 2003/04, these structures (including their associated metadata) will be on par with application portfolios, organization charts and procedure manuals for defining a business to its employees and affiliates.

Data records, data structures, and definitions commonly accepted throughout an enterprise reduce the fiefdoms pulling against each other due to differences in the way each perceives where the enterprise has been, is presently, and is headed.  Readily accessible current and historical records of transactions, affiliates (partners, employees, customers, suppliers), and business processes (or rules), along with definitional and navigational metadata (see ADS Delta 896, 21st Century Metadata: Mapping the Enterprise Genome, 7 Aug 2000), enable employees to paddle in the same direction.  Conversely, application-specific data stores (e.g. accounts receivable versus order status) and geography-specific data stores (e.g. North American sales vs. international sales) offer conflicting or insular views of the enterprise that, while important for feeding transactional systems, provide no “single version of the truth,” giving rise to inconsistency in the way enterprise factions function.

While enterprises struggle to consolidate systems and collapse redundant databases to enable greater operational, analytical, and collaborative consistencies, changing economic conditions have made this job more difficult.  E-commerce, in particular, has exploded data management challenges along three dimensions: volume, velocity and variety.  In 2001/02, IT organizations must compile a variety of approaches to have at their disposal for dealing with each.

Data Volume

E-commerce channels increase the depth and breadth of data available about a transaction (or any point of interaction). The lower cost of e-channels enables an enterprise to offer its goods or services to more individuals or trading partners, and up to 10x the quantity of data about an individual transaction may be collected—thereby increasing the overall volume of data to be managed.  Furthermore, as enterprises come to see information as a tangible asset, they become reluctant to discard it.

Typically, increases in data volume are handled by purchasing additional online storage.  However, as data volume increases, the relative value of each data point decreases proportionately—resulting in a poor financial justification for merely incrementing online storage. Viable alternatives and supplements to hanging new disk include:

  • Implementing tiered storage systems (see SIS Delta 860, 19 Apr 2000) that cost effectively balance levels of data utility with data availability using a variety of media.
  • Limiting data collected to that which will be leveraged by current or imminent business processes
  • Limiting certain analytic structures to a percentage of statistically valid sample data (a sample-size sketch follows this list).
  • Profiling data sources to identify and subsequently eliminate redundancies
  • Monitoring data usage to determine “cold spots” of unused data that can be eliminated or offloaded to tape (e.g. Ambeo, BEZ Systems, Teleran)
  • Outsourcing data management altogether (e.g. EDS, IBM)
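
The original note stops at the recommendation; as a minimal modern sketch (not part of the 2001 research), the standard Cochran sample-size formula below shows how small a “statistically valid sample” can be relative to a full table. The 50-million-row example, the Python code, and the parameter defaults are illustrative assumptions.

    import math

    def sample_size(population, z=1.96, margin_of_error=0.03, p=0.5):
        """Cochran's formula with a finite-population correction.

        z=1.96 corresponds to 95% confidence; p=0.5 is the most
        conservative assumption about variability in the data.
        """
        n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    # A hypothetical 50-million-row transaction table can often be profiled
    # from roughly a thousand rows at 95% confidence and a 3% margin of error.
    print(sample_size(50_000_000))  # ~1068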

Data Velocity

E-commerce has also increased point-of-interaction (POI) speed and, consequently, the pace of data used to support interactions and generated by interactions. As POI performance is increasingly perceived as a competitive differentiator (e.g. Web site response, inventory availability analysis, transaction execution, order tracking update, product/service delivery, etc.), so too is an organization’s ability to manage data velocity.  Recognizing that data velocity management is much more than a physical bandwidth and protocol issue, enterprises are implementing architectural solutions such as:

  • Operational data stores (ODSs) that periodically extract, integrate and re-organize production data for operational inquiry or tactical analysis
  • Caches that provide instant access to transaction data while buffering back-end systems from additional load and performance degradation. (Unlike ODSs, caches are updated according to adaptive business rules and have schemas that mimic the back-end source.) A minimal sketch of such a cache follows this list.
  • Point-to-point (P2P) data routing between databases and applications (e.g. D2K, DataMirror) that circumvents high-latency hub-and-spoke models that are more appropriate for strategic analysis
  • Designing architectures that balance data latency with application data requirements and decision cycles, without assuming the entire information supply chain must be near real-time.
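
The cache described above is an architectural pattern rather than a product; a minimal sketch (an assumption-laden illustration, not from the original note) is a read-through cache with a time-to-live, so a point-of-interaction query reaches the back-end system at most once per key per expiry window. The class name, TTL value, and inventory example are hypothetical.

    import time

    class ReadThroughCache:
        """Serves reads locally, refreshing from the back end only on expiry."""

        def __init__(self, fetch_from_backend, ttl_seconds=30):
            self._fetch = fetch_from_backend   # callable that hits the system of record
            self._ttl = ttl_seconds
            self._store = {}                   # key -> (value, expiry time)

        def get(self, key):
            value, expires = self._store.get(key, (None, 0.0))
            if time.monotonic() < expires:
                return value                   # served from cache; back end untouched
            value = self._fetch(key)           # refresh from the back-end source
            self._store[key] = (value, time.monotonic() + self._ttl)
            return value

    # Hypothetical usage: inventory-availability lookups at the point of
    # interaction hit the order database at most once per SKU per TTL window.
    inventory = ReadThroughCache(lambda sku: {"sku": sku, "on_hand": 42}, ttl_seconds=60)
    print(inventory.get("ABC-123"))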

Data Variety

Through 2003/04, no greater barrier to effective data management will exist than the variety of incompatible data formats, non-aligned data structures, and inconsistent data semantics.  By this time, interchange and translation mechanisms will be built into most DBMSs. But until then, application portfolio sprawl (particularly when based on a “strategy” of autonomous software implementations due to e-commerce solution immaturity), increased partnerships, and M&A activity will intensify data variety challenges. Attempts to resolve data variety issues must be approached as an ongoing endeavor encompassing the following techniques:

  • Data profiling (e.g. Data Mentors, Metagenix) to discover hidden relationships and resolve inconsistencies across multiple data sources (see ADS898)
  • XML-based data format “universal translators” that import data into standard XML documents for export into another data format (e.g. infoShark, XML Solutions); a sketch of this pattern follows the list
  • Enterprise application integration (EAI) predefined adapters (e.g. NEON, Tibco, Mercator) for acquiring and delivering data between known applications via message queues, or EAI development kits for building custom adapters.
  • Data access middleware (e.g. Information Builders’ EDA/SQL, SAS Access, OLE DB, ODBC) for direct connectivity between applications and databases
  • Distributed query management (DQM) software (e.g. Enth, InfoRay, Metagon) that adds a data routing and integration intelligence layer above “dumb” data access middleware
  • Metadata management solutions (i.e. repositories and schema standards) to capture and make available definitional metadata that can help provide contextual consistency to enterprise data
  • Advanced indexing techniques for relating (if not physically integrating) data of various incompatible types (e.g. multimedia, documents, structured data, business rules).
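
The products named above are period-specific; as a hedged sketch of the XML “universal translator” pattern (using modern standard-library modules in place of the vendors cited, with a hypothetical file name and field layout), a flat partner feed can be imported into a generic XML document for export into another format.

    import csv
    import xml.etree.ElementTree as ET

    def csv_to_xml(csv_path, root_tag="records", record_tag="record"):
        """Import a delimited file into a generic XML document.

        Assumes the CSV header row supplies well-formed XML element names.
        """
        root = ET.Element(root_tag)
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                record = ET.SubElement(root, record_tag)
                for field, value in row.items():
                    ET.SubElement(record, field).text = value
        return ET.tostring(root, encoding="unicode")

    # print(csv_to_xml("partner_orders.csv"))   # hypothetical partner feed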

As with any sufficiently fashionable technology, users should expect the data management marketplace ebb-and-flow to yield solutions that consolidate multiple techniques and solutions that are increasingly application/environment specific. (See Figure 1 – Data Management Solutions.) In selecting a technique or technology, enterprises should first perform an information audit assessing the status of their information supply chain to identify and prioritize particular data management issues.

Business Impact: Attention to data management, particularly in a climate of e-commerce and greater need for collaboration, can enable enterprises to achieve greater returns on their information assets.

Bottom Line: In 2001/02, IT organizations must look beyond traditional direct brute force physical approaches to data management.  Through 2003/04, practices for resolving e-commerce accelerated data volume, velocity and variety issues will become more formalized and diverse.  Increasingly, these techniques involve trade-offs and architectural solutions that involve and impact application portfolios and business strategy decisions.

###

Over the past decade, Gartner analysts including Regina Casonato, Anne Lapkin, Mark A. Beyer, Yvonne Genovese and Ted Friedman have continued to expand our research on this topic, identifying and refining other “big data” concepts. In September 2011 they published the tremendous research note Information Management in the 21st Century.  And in 2012, Mark Beyer and I developed and published Gartner’s updated definition of Big Data to reflect its value proposition and requirements for “new innovative forms of processing.” (See The Importance of ‘Big Data’: A Definition)

Doug Laney is a research vice president for Gartner Research, where he covers business analytics solutions and projects, information management, and data-governance-related issues. He is considered a pioneer in the field of data warehousing and created the first commercial project methodology for business intelligence/data warehouse projects. Mr. Laney also originated the discipline of information economics (infonomics).

Follow Doug on Twitter: @Doug_Laney

More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity and emerging technology firms. He has extensive industry experience in intelligence and security, was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
