Take Big Advantage of Your Data

A fresh look at data virtualization

Last July, I wrote Data Virtualization Q&A: What's It All About, an ambitious article that addressed data virtualization from numerous angles, including use cases, business benefits, and technology.

Since then, with the continued rapid expansion of big data and analytics, as well as advances in data virtualization technology, my 360-degree view of data virtualization has evolved.

Data Rich, Information Poor
As I think about data virtualization today, the big data and analytics challenge it addresses best is helping enterprises take full advantage of their data.

In other words, enterprises today are data rich, with loads of enterprise, cloud, third-party and Big Data. But they remain information poor.

In this context, let's consider the role of data virtualization with ten back-to-the-basics questions and answers.

What is Data Virtualization?
Data virtualization is an agile data integration approach organizations use to gain more insight from their data.

Unlike data consolidation or data replication, data virtualization integrates diverse data without costly extra copies and additional data management complexity.

With data virtualization, you respond faster to ever changing analytics and BI needs, fast-track your data management evolution and save 50-75% over data replication and consolidation.
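The no-copy idea at the heart of this definition can be shown with a minimal sketch: a "business view" that joins two live sources at query time, so neither source is replicated. The source names, fields, and data below are hypothetical illustrations, not any vendor's API.

```python
# A minimal sketch of data virtualization: a business view that joins
# two live sources on demand instead of copying them into a warehouse.
# All names and data here are hypothetical.

def crm_source():
    # Simulates a query against a CRM system.
    return [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]

def orders_source():
    # Simulates a query against an order-management system.
    return [{"cust_id": 1, "total": 1200.0}, {"cust_id": 2, "total": 300.0}]

def customer_orders_view():
    # The virtual "business view": joins the sources at query time,
    # so no extra copy of either source is created or maintained.
    names = {c["cust_id"]: c["name"] for c in crm_source()}
    return [{"customer": names[o["cust_id"]], "total": o["total"]}
            for o in orders_source()]

print(customer_orders_view())
```

Each call to the view pulls current data from the sources, which is why consumers see up-to-date results without the extra copies and management overhead of replication.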

Why Use Data Virtualization?
With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.

Data virtualization provides instant access to all the data you want, the way you want it.

Enterprise, cloud, Big Data, and more, no problem!

What Are the Benefits of Data Virtualization?
With data virtualization, you benefit in several important ways.

  • Gain more business insights by leveraging all your data - Empower your people with instant access to all the data they want, the way they want it.
  • Respond faster to your ever changing analytics and BI needs - Five to ten times faster time to solution than traditional data integration.
  • Fast-track your data management evolution - Start quickly and scale successfully with an easy-to-adopt overlay to existing infrastructure.
  • Save 50-75% over data replication and consolidation - Data virtualization's streamlined approach reduces complexity and saves money.

Who Uses Data Virtualization?
Data virtualization is used by both business and IT organizations.

  • Business Leaders - Data virtualization helps you drive business advantage from your data.
  • Information Consumers - From spreadsheet user to data scientist, data virtualization provides instant access to all the data you want, the way you want it.
  • CIOs and IT Leaders - Data virtualization's agile integration approach lets you respond faster to ever changing analytics and BI needs and do it for less.
  • CIOs and Architects - Data virtualization adds data integration flexibility so you can successfully evolve your data management strategy and architecture.
  • Integration Developers - Easy to learn and highly productive to use, data virtualization lets you deliver more business value sooner.

How Does Data Virtualization Work?
Data virtualization's business views provide instant access to the data your business users require, while shielding them from IT's complexity.

  • Develop - Your IT staff uses data virtualization's rich data analysis, design and development tools to build the business views (also known as data services).
  • Run - When your business users run a report or refresh a dashboard, data virtualization's high-performance query engine accesses the data sources and delivers the exact information requested.
  • Manage - Data virtualization's management, monitoring, security and governance functions ensure security, reliability and scalability.

Data virtualization vendor products such as the Composite Data Virtualization Platform provide all these capabilities in a complete and unified offering.
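The Develop and Run steps above can be sketched with SQLite standing in for a federated query engine: two schemas play the role of separate sources, and a SQL view plays the role of the business view (data service). This is an illustrative analogy under stated assumptions, not how any specific platform is implemented; all table and column names are hypothetical.

```python
import sqlite3

# Two "sources": the main schema and an attached schema named "warehouse".
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH ':memory:' AS warehouse")

conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT)")
conn.execute("CREATE TABLE warehouse.sales (cust_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "EMEA"), (2, "APAC")])
conn.executemany("INSERT INTO warehouse.sales VALUES (?, ?)",
                 [(1, 500.0), (2, 750.0), (1, 250.0)])

# Develop: define the business view once (TEMP so it may span both schemas).
conn.execute("""
    CREATE TEMP VIEW sales_by_region AS
    SELECT c.region, SUM(s.amount) AS total
    FROM customers c JOIN warehouse.sales s ON s.cust_id = c.id
    GROUP BY c.region
""")

# Run: each report query is answered from the live sources via the view.
rows = conn.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region").fetchall()
print(rows)  # [('APAC', 750.0), ('EMEA', 750.0)]
```

The point of the analogy: consumers query the view's business-friendly shape, while the engine resolves joins against the underlying sources at run time.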

When to Use Data Virtualization?
You can use data virtualization to enable a wide range of information solutions.

When Not to Use Data Virtualization?
Data virtualization is not the answer to every data integration problem. Sometimes data consolidation in a warehouse or mart, along with ETL or ELT, is a better solution for a particular use case. And sometimes a hybrid mix is the right answer.

You can use a Data Integration Strategy Decision Tool to help you decide when to use data virtualization, data consolidation or perhaps a hybrid combination.
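As a rough illustration of the trade-off such a tool weighs, here is a toy heuristic. The criteria and outcomes are my illustrative assumptions, not the logic of any actual Data Integration Strategy Decision Tool.

```python
# Toy decision heuristic (illustrative assumptions only): consolidation
# suits deep historical analysis; virtualization suits up-to-the-minute
# access to live sources; needing both suggests a hybrid.

def recommend_approach(needs_deep_history: bool, needs_fresh_data: bool) -> str:
    if needs_deep_history and needs_fresh_data:
        return "hybrid"
    if needs_deep_history:
        return "consolidation"
    if needs_fresh_data:
        return "virtualization"
    return "either"

print(recommend_approach(needs_deep_history=False, needs_fresh_data=True))
```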

What is the Business Case for Data Virtualization?
Data virtualization has a compelling business case. The following drivers make data virtualization a "must have" for any large organization today.

  • Profit Growth - Data virtualization delivers the information your organization requires to increase revenue and reduce costs.
  • Risk Reduction - Data virtualization's up-to-the-minute business insights help you manage business risk and reduce compliance penalties.  Plus data virtualization's rapid development and quick iterations lower your IT project risk.
  • Technology Optimization - Data virtualization improves utilization of existing server and storage investments. And with less storage required, hardware and governance savings are substantial.
  • Staff Productivity - Data virtualization's easy-to-use, high-productivity design and development environments improve your staff effectiveness and efficiency.
  • Time-to-Solution Acceleration - Your data virtualization projects are completed faster so business benefits are derived sooner. Lower project costs are an additional agility benefit.

How to Deploy Data Virtualization?
You can start your data virtualization adoption with specific projects that address immediate information needs.

You can also deploy data virtualization in a more enterprise-wide manner, with common semantics, shared objects and architecture, and an Integration Competency Center.

Which Vendor Should I Select?
If you are like most, you would prefer to go with the data virtualization market leader. But how do you define the market leader?

Is it the one with the most mature product? For example, one data virtualization vendor has spent a decade delivering nearly 400 person-years of R&D, six million lines of code and millions of hours of operational deployment.

Is it the one with the most installations? For example, the same vendor is used by nearly two hundred of the world's largest organizations.

Is it the one with the most domain knowledge? This same vendor's data virtualization thought leadership assets demonstrate the expertise they can bring to bear for you.

Conclusion
With so many new opportunities from Big Data, analytics and more, today's challenge is how to take big advantage of them. This article suggests that data virtualization can be that path, and provides answers to key questions about it. The time is now.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
