
Market Challenges for Big Data Solutions Providers

NEW YORK, Feb. 4, 2014 /PRNewswire/ -- Reportlinker.com announces that a new market research report is available in its catalogue:

Market Challenges for Big Data Solutions Providers
http://www.reportlinker.com/p02005754/Market-Challenges-for-Big-Data-Solutions-Providers.html#utm_source=prnewswire&utm_medium=pr&utm_campaign=Open_Source_and_Free_Software

Introduction

Once upon a time, databases were static information repositories, housed on expensive mainframe computers, updated in planned and consistent cycles, and accessible only to those who could mount the drives, punch the cards and paper tapes, and decipher the cryptic output. Fast forward fifty years, and Excel lets everyone enter and retrieve whatever data they can gather, while organizations seem to have as many databases as they have departments, projects and accounting auditors.

Data storage options also have evolved, from basement vaults and offsite bunkers full of tapes and disc arrays, to data warehouses, mirrored systems, storage area networks, and instances in service provider network clouds.

Advances in computing power and network capacity have greatly expanded our ability to collect, store, analyze and transport the data needed to support our traditional social and economic activities, while at the same time stimulating the generation of vast new data stores. We generate this new data every time we use our smartphones and tablets to "go online" and check the weather, update our calendars, order a pizza, or download a podcast.
Companies like Amazon and Google pioneered the concept of Big Data, essentially by using commodity servers, copious bandwidth and open-source software platforms as substitutes for what was then the gold standard for data management: complex, special-purpose hardware and software running over carefully engineered networks. These early adopters discovered that their new environments were not only more cost-effective but also more scalable than traditional data processing, networking and storage solutions. While the Big Data applications used by these online pioneers and by the scientific community (Big Science) have been well underway for a decade or more, successful implementations in more mainstream market segments are only just beginning to accumulate.
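
For readers who want to picture what those open-source platforms actually do, the sketch below illustrates the map-and-reduce pattern they popularized: split a large dataset across many cheap workers, process each slice independently, then merge the partial results. It is a toy, single-machine Python sketch using only the standard library (the worker count, sample data and word-count task are illustrative assumptions), not a depiction of any particular vendor's stack.

# Toy illustration of the MapReduce-style, scale-out processing model that
# open-source Big Data platforms popularized on commodity hardware.
# Local, standard-library-only sketch; not any vendor's actual stack.
from collections import Counter
from multiprocessing import Pool


def map_count(chunk_of_lines):
    """Map step: each worker counts words in its own slice of the data."""
    counts = Counter()
    for line in chunk_of_lines:
        counts.update(line.lower().split())
    return counts


def reduce_counts(partial_counts):
    """Reduce step: merge the per-worker counts into a single result."""
    total = Counter()
    for partial in partial_counts:
        total.update(partial)
    return total


if __name__ == "__main__":
    # Stand-in for a large dataset that would normally be spread across many cheap machines.
    lines = [
        "big data on commodity servers",
        "commodity servers and open source software",
        "open source software scales out not up",
    ] * 1000

    # Split the work into chunks, one per worker process.
    workers = 4
    chunk_size = len(lines) // workers
    chunks = [lines[i:i + chunk_size] for i in range(0, len(lines), chunk_size)]

    with Pool(workers) as pool:
        partials = pool.map(map_count, chunks)

    totals = reduce_counts(partials)
    print(totals.most_common(5))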

There is a simple reason why so many different types of vendors have a stake in the multi-billion dollar market for Big Data solutions. It's the same reason why so many different types of organizations are trying to figure out how they might put Big Data to work in their own environments and on their own applications. Everyone senses that Big Data could be the next crucial and pervasive information technology, as transformative and necessary as its computing and networking antecedents have proven to be over the past fifty years.

On the brink of any new high tech era, the temptation is strong for solution providers to hype the benefits and minimize the disruption that such major shifts can produce. In these early days, one of the most important missing pieces in the Big Data puzzle has to do with accurately articulating both the potential payoffs and the attendant challenges. In this paper, Frost & Sullivan ventures some observations about these communication challenges. To illustrate these observations, we take a look at how Oracle is approaching the Big Data market, how Netflix defected from Oracle to Amazon Web Services (AWS), and how Salesforce.com opted to combine Oracle solutions with those of newer solution providers.

Oracle: How One Big Vendor Does Big Data

In a recent LinkedIn article, Mark Hurd, late of HP and now leading Oracle, mischaracterizes Big Data as just one of several "modern business challenges," alongside mobile, social, real-time visibility and decision-making, and deeper, longer-lasting customer engagements and experiences. Although Big Data initiatives have already begun to address all the other items on Hurd's list, his prescribed solution is what he calls "truly modern IT systems," and he broadly dismisses legacy systems as "stuff in the basement that creates little or no value."

Perhaps it's a bit unfair to fault Hurd for using the kind of marketing message so common in the high-tech industry, which is often criticized for over-simplification and an exaggerated sense of urgency. And it isn't a surprise that Hurd is again dissing legacy IT installations; after all, he made part of his reputation at HP by inverting the IT truism that legacy systems eat up the lion's share of most IT budgets, leaving only a small fraction for innovation and growth initiatives. However, when he claims that, by spending less than that share on these "truly modern IT systems," organizations could "liberate precious IT dollars from low-value or no-value infrastructure and integration" and "not only help drive customer-focused innovation, but also help to cut overall IT spending at the same time," we have a prime example of the need to better articulate Big Data's promises and challenges.

Big traditional IT solution providers, like Oracle, who are pitching Big Data also may want to soft-pedal the urgency aspect for a while. That's because the "stuff in the basement that creates little or no value" is still a primary revenue generator for these vendors, who collect stiff fees for ongoing licensing and support. It's also the functional IT backbone of the installed base of customers who pay those fees. Neither these vendors nor their customers are really in a position to simply yank these systems out and start over with a clean sheet of paper.

Table Of Contents

Market Challenges for Big Data Solutions Providers (BDA 2-01)
1. Introduction
2. Oracle: How One Big Vendor Does Big Data
3. Netflix: New Solutions for New Companies with New Tactics
4. Salesforce.com: Having It Both Ways
5. Frost & Sullivan - The Last Word
6. About Frost & Sullivan

To order this report: Market Challenges for Big Data Solutions Providers
http://www.reportlinker.com/p02005754/Market-Challenges-for-Big-Data-Solutions-Providers.html#utm_source=prnewswire&utm_medium=pr&utm_campaign=Open_Source_and_Free_Software

__________________________
Contact Clare: [email protected]
US: (339)-368-6001
Intl: +1 339-368-6001

SOURCE Reportlinker

