By Robert Eve
February 15, 2013 08:00 AM EST
Last July, I wrote Data Virtualization Q&A: What's It All About, an ambitious article that attempted to address the topic of data virtualization from numerous angles including use cases, business benefits, and technology.
Since then, with the continued rapid expansion of big data and analytics, as well as data virtualization technology advances, my 360-degree view of data virtualization has evolved.
Data Rich, Information Poor
As I think about data virtualization today, the big data and analytics challenge it best addresses is helping enterprises take full advantage of their data.
In other words, enterprises today are data rich, with loads of enterprise, cloud, third-party and Big Data sources. But they remain information poor.
In this context, let's consider the role of data virtualization with ten, back-to-the-basics questions and answers.
What is Data Virtualization?
Data virtualization is an agile data integration approach organizations use to gain more insight from their data.
Unlike data consolidation or data replication, data virtualization integrates diverse data without costly extra copies and additional data management complexity.
With data virtualization, you respond faster to ever-changing analytics and BI needs, fast-track your data management evolution, and save 50-75% over data replication and consolidation.
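The contrast with replication can be sketched in a few lines of code. In this minimal, hypothetical illustration (the source names and data are invented for the example), a "virtual view" composes two sources at query time, so no second copy of the data is ever created or maintained:

```python
# Hypothetical sketch: a virtual view joins two sources on demand,
# so there is no copied, separately managed dataset (unlike replication).

# Two independent sources (stand-ins for a CRM database and a cloud API).
crm_customers = [
    {"id": 1, "name": "Acme"},
    {"id": 2, "name": "Globex"},
]
cloud_orders = [
    {"customer_id": 1, "total": 250.0},
    {"customer_id": 1, "total": 100.0},
    {"customer_id": 2, "total": 75.0},
]

def customer_order_totals():
    """Virtual view: computed fresh from the live sources on every call."""
    totals = {}
    for order in cloud_orders:
        cid = order["customer_id"]
        totals[cid] = totals.get(cid, 0.0) + order["total"]
    return [
        {"name": c["name"], "total": totals.get(c["id"], 0.0)}
        for c in crm_customers
    ]

print(customer_order_totals())

# A change in a source is visible on the very next query; with replication,
# a refresh job would have to copy the data again first.
cloud_orders.append({"customer_id": 2, "total": 25.0})
print(customer_order_totals())
```

The point of the sketch is the absence of a staging step: there is no second dataset to load, synchronize or govern, which is where the cost savings over consolidation come from.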
Why Use Data Virtualization?
With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.
Data virtualization provides instant access to all the data you want, the way you want it.
Enterprise, cloud, Big Data, and more, no problem!
What Are the Benefits of Data Virtualization?
With data virtualization, you benefit in several important ways.
- Gain more business insights by leveraging all your data - Empower your people with instant access to all the data they want, the way they want it.
- Respond faster to your ever-changing analytics and BI needs - Five to ten times faster time to solution than traditional data integration.
- Fast-track your data management evolution - Start quickly and scale successfully with an easy-to-adopt overlay to existing infrastructure.
- Save 50-75% over data replication and consolidation - Data virtualization's streamlined approach reduces complexity and saves money.
Who Uses Data Virtualization?
Data virtualization is used by your business and IT organizations.
- Business Leaders - Data virtualization helps you drive business advantage from your data.
- Information Consumers - From spreadsheet user to data scientist, data virtualization provides instant access to all the data you want, the way you want it.
- CIOs and IT Leaders - Data virtualization's agile integration approach lets you respond faster to ever-changing analytics and BI needs and do it for less.
- CTOs and Architects - Data virtualization adds data integration flexibility so you can successfully evolve your data management strategy and architecture.
- Integration Developers - Easy to learn and highly productive to use, data virtualization lets you deliver more business value sooner.
How Does Data Virtualization Work?
Data virtualization's business views provide instant access to the data your business users require, while shielding them from IT's complexity.
- Develop - Your IT staff uses data virtualization's rich data analysis, design and development tools to build the business views (also known as data services).
- Run - When your business users run a report or refresh a dashboard, data virtualization's high-performance query engine accesses the data sources and delivers the exact information requested.
- Manage - Data virtualization's management, monitoring, security and governance functions ensure reliability, scalability and secure operation.
Data virtualization vendor products such as the Composite Data Virtualization Platform provide all these capabilities in a complete and unified offering.
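The develop/run cycle above can be caricatured in code. This is an illustrative sketch only (the class, sources and view are invented, not any vendor's API): IT develops a business view once, mapping raw sources to a business-friendly shape, and every consumer request then runs against live data.

```python
# Illustrative sketch (names invented): a business view that shields
# consumers from source-level complexity.

class VirtualView:
    """Develop: IT defines the view once, mapping sources to a business shape."""
    def __init__(self, name, sources, transform):
        self.name = name
        self.sources = sources      # callables that fetch live source data
        self.transform = transform  # shapes raw rows into the business view

    def run(self, **params):
        """Run: each request pulls current data from all sources."""
        raw = {label: fetch() for label, fetch in self.sources.items()}
        return self.transform(raw, **params)

# Two stand-in sources; in practice these would be database or API adapters.
def fetch_warehouse():
    return [{"sku": "A1", "units": 40}, {"sku": "B2", "units": 5}]

def fetch_pricing():
    return {"A1": 9.99, "B2": 24.50}

def revenue_view(raw, min_units=0):
    prices = raw["pricing"]
    return [
        {"sku": r["sku"], "revenue": round(r["units"] * prices[r["sku"]], 2)}
        for r in raw["stock"] if r["units"] >= min_units
    ]

view = VirtualView(
    "sku_revenue",
    {"stock": fetch_warehouse, "pricing": fetch_pricing},
    revenue_view,
)
print(view.run(min_units=10))  # consumers see SKUs and revenue, not adapters
```

A real platform adds the pieces the sketch omits: query optimization across sources, caching, security and governance, which is why the Manage layer matters as much as Develop and Run.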
When to Use Data Virtualization?
You can use data virtualization to enable a wide range of information solutions including:
- Agile Analytics and BI Solutions
- Data Warehouse Extension Solutions
- Logical Data Warehouse Solutions
- Data Virtualization Architecture Solutions
- Data Integration and Management Solutions
- Business Solutions
- Industry Solutions
When Not to Use Data Virtualization?
Data virtualization is not the answer to every data integration problem. Sometimes data consolidation in a warehouse or mart, along with ETL or ELT, is a better solution for a particular use case. And sometimes a hybrid mix is the right answer.
You can use a Data Integration Strategy Decision Tool to help you decide when to use data virtualization, data consolidation or perhaps a hybrid combination.
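The decision logic behind such a tool can be reduced to a toy heuristic. The factors and thresholds below are invented for illustration, not the tool's actual criteria; they simply show the kind of trade-off involved:

```python
# Toy decision heuristic (factors and thresholds invented for illustration,
# not the actual Data Integration Strategy Decision Tool logic).
def integration_style(needs_history, source_volatility, latency_tolerance_min):
    """Suggest an integration approach for a single use case."""
    if needs_history and latency_tolerance_min >= 60:
        return "consolidate"   # ETL/ELT into a warehouse or mart
    if not needs_history and source_volatility == "high":
        return "virtualize"    # federate live sources on demand
    return "hybrid"            # virtual views over a consolidated core

# Historical reporting with overnight latency points toward consolidation;
# an up-to-the-minute view of fast-changing sources points toward virtualization.
print(integration_style(needs_history=True, source_volatility="low",
                        latency_tolerance_min=1440))
print(integration_style(needs_history=False, source_volatility="high",
                        latency_tolerance_min=5))
```

The real decision weighs more dimensions (data volumes, query performance, governance requirements), but the shape is the same: match the integration style to the use case rather than standardizing on one.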
What is the Business Case for Data Virtualization?
Data virtualization has a compelling business case. The following drivers make data virtualization a "must have" for any large organization today.
- Profit Growth - Data virtualization delivers the information your organization requires to increase revenue and reduce costs.
- Risk Reduction - Data virtualization's up-to-the-minute business insights help you manage business risk and reduce compliance penalties. Plus data virtualization's rapid development and quick iterations lower your IT project risk.
- Technology Optimization - Data virtualization improves utilization of existing server and storage investments. And with less storage required, hardware and governance savings are substantial.
- Staff Productivity - Data virtualization's easy-to-use, high-productivity design and development environments improve your staff effectiveness and efficiency.
- Time-to-Solution Acceleration - Your data virtualization projects are completed faster so business benefits are derived sooner. Lower project costs are an additional agility benefit.
How to Deploy Data Virtualization?
You can start your data virtualization adoption with specific projects that address immediate information needs.
Which Vendor Should I Select?
If you are like most, you would prefer to go with the data virtualization market leader. But how do you define the market leader?
Is it the one with the most mature product? For example, one data virtualization vendor has spent a decade delivering nearly 400 man-years of R&D, six million lines of code and millions of hours of operational deployment.
Is it the one with the most installations? For example, the same vendor is used by nearly two hundred of the world's largest organizations.
Is it the one with the most domain knowledge? This same vendor's data virtualization thought leadership assets demonstrate the expertise they can bring to bear for you. These include:
- The first book on data virtualization, Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.
- Data virtualization's foremost microsite, the DV Café
- The Data Virtualization Leadership Series of analyst reports on data virtualization
- Data virtualization's only dedicated blog, the Data Virtualization Leadership Blog
- The Data Virtualization Channel on YouTube with users, analysts, chalk talks and more
- The Data Virtualization Leadership Awards honoring users
- Data Virtualization Day Resources, assets from the premier events in data virtualization
- Data virtualization's longest running newsletter, Enterprise Information Insight
With so many new opportunities from Big Data, analytics and more, today's challenge is how to take full advantage of them. This article suggests that data virtualization can be that path, and provides answers to key questions about data virtualization. The time is now.