Oracle Database In-Memory Powers the Real-Time Enterprise

Larry Ellison Unveils Breakthrough Technology, Which Turns the Promise of Real-Time Into a Reality

REDWOOD SHORES, CA -- (Marketwired) -- 06/10/14 -- Oracle (NYSE: ORCL)

News Summary

In today's fast-paced, hyper-connected, and mobile/social world, businesses demand instantaneous information and responsiveness. In this environment, businesses must be able to move as fast as their customers, be they B2B or B2C, to deliver the experience those customers demand.

For years, technology companies have talked about the "real-time" enterprise. And for years, talk is all those vendors delivered, because they lacked the range of world-class technologies needed to make good on the real-time promise. Today, Oracle is changing that paradigm, because only Oracle can bring customers optimized in-memory capabilities across applications, middleware, databases, and systems. Oracle Database In-Memory transparently extends the power of Oracle Database 12c, enabling organizations to discover business insights in real time while simultaneously increasing transactional performance. With Oracle Database In-Memory, users get immediate answers to business questions that previously took hours to obtain, and can deliver a faster, better experience to both their internal and external constituents.

Oracle Database In-Memory delivers leading-edge in-memory performance without restricting functionality or accepting compromise, complexity, or risk. Deploying Oracle Database In-Memory with virtually any existing Oracle Database-compatible application is as easy as flipping a switch -- no application changes are required. It is fully integrated with Oracle Database's renowned scale-up, scale-out, storage tiering, availability, and security technologies, making it the most industrial-strength in-memory offering in the industry.
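The "switch" in question is, in broad strokes, declarative: the in-memory column store is sized with an initialization parameter, and individual tables are opted in with DDL. A minimal sketch against Oracle Database 12c (the table name and size shown here are illustrative, not from this announcement):

```sql
-- Reserve memory for the in-memory column store (takes effect after restart)
ALTER SYSTEM SET INMEMORY_SIZE = 16G SCOPE = SPFILE;

-- Opt an existing table into the column store; the application is unchanged
ALTER TABLE sales INMEMORY PRIORITY HIGH;
```

Queries then use the column store transparently through the optimizer, which is what allows existing applications to benefit without SQL or code changes.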

At a special event at Oracle's headquarters, CEO Larry Ellison described how the ability to combine real-time data analysis with sub-second transactions on existing applications enables organizations to become Real-Time Enterprises that quickly make data-driven decisions, respond instantly to customers' demands, and continuously optimize key processes.

What Customers are Saying

  • "As a consumer Internet pioneer and innovator, Yahoo is always at the leading edge of big data and database technology to deliver a responsive, seamless consumer experience. We joined Oracle's beta program to understand how memory optimization could sharpen our big data processing," said Sudhi Vijayakumar, Yahoo's Principal Oracle Database Architect. "Full support for Oracle Real Application Clusters' scale-out capabilities means Oracle Database In-Memory can be used even on our largest data warehouses."

News Facts

  • Oracle Database In-Memory enables customers to accelerate database performance by orders of magnitude for analytics, data warehousing, and reporting while also speeding up online transaction processing (OLTP).
  • An innovative, dual-format in-memory architecture combines the best of row format and column format to simultaneously deliver fast analytics and efficient OLTP.
  • Oracle Database In-Memory allows any existing Oracle Database-compatible application to automatically and transparently take advantage of columnar in-memory processing, without additional programming or application changes.
  • Oracle Database In-Memory demonstrated from 100x to more than 1000x speedup for enterprise application modules in performance tests, including Oracle E-Business Suite, Oracle's JD Edwards, Oracle's PeopleSoft, Oracle's Siebel, and Oracle Fusion Applications.
  • The ability to combine real-time data analysis with sub-second transactions on existing applications enables organizations to become Real-Time Enterprises that quickly make data-driven decisions, respond instantly to customers' demands, and continuously optimize all key processes.
  • Oracle Database In-Memory has undergone extensive validation testing by hundreds of end-users, ISV partners, and Oracle Applications teams over the past nine months.
  • Oracle Database In-Memory is scheduled for general availability in July and can be used with all hardware platforms on which Oracle Database 12c is supported.
  • Oracle PartnerNetwork (OPN) is also announcing that Oracle Database 12c Ready certification will soon include Oracle Database In-Memory.
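The dual-format architecture described above can be pictured with a toy sketch (illustrative Python, not Oracle's implementation): the same records held in a row-keyed form for single-record OLTP lookups, and in a pivoted, column-per-attribute form so an analytic aggregate touches only the one attribute it needs.

```python
# Illustrative sketch of why a dual row/column format suits mixed workloads.
# Row format: cheap to read or update one whole record (OLTP).
# Column format: cheap to scan one attribute across all records (analytics).

rows = [
    {"order_id": 1, "customer": "acme", "amount": 120.0},
    {"order_id": 2, "customer": "globex", "amount": 75.5},
    {"order_id": 3, "customer": "acme", "amount": 310.0},
]

# Row format: fetch one complete record by key -- a typical transactional access.
by_id = {r["order_id"]: r for r in rows}
record = by_id[2]  # touches one row, all of its columns

# Column format: the same data pivoted so each attribute is stored contiguously.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# An analytic aggregate now reads only the "amount" column, not whole rows.
total = sum(columns["amount"])
```

In the real product both representations are maintained in memory simultaneously, which is what lets the same database serve fast analytics and efficient OLTP at once.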

Software and Hardware Engineered for the Real-Time Enterprise

  • Building on years of innovation and maturity, Oracle Database In-Memory inherits all Oracle Database capabilities, including:
    • Maximum Availability Architecture to protect against data loss and downtime.
    • Industry-leading security technologies.
    • Scalability to meet any requirement via scale-up on large SMP servers, scale-out across a cluster of servers, and storage tiering, to cost-effectively run databases of any size -- whether petabyte-scale data warehouses, big data processing, or database clouds.
    • Rich programmability: Java, R, Big Data, PHP, Python, Node, REST, Ruby, and more.
    • Full data type support: relational, objects, XML, text, spatial, and new integrated JSON support.
  • Oracle Engineered Systems are the ideal complement to Oracle Database In-Memory:
    • Oracle Engineered Systems, including Oracle Exadata Database Machine and Oracle SuperCluster, are optimized for Oracle Database In-Memory, featuring large memory capacity, extreme performance, and high availability while tiering less active data to flash and disk to deliver outstanding cost effectiveness.
    • In-Memory fault tolerance on Oracle Engineered Systems optionally duplicates in-memory data across nodes, enabling queries to instantly use a copy of in-memory data if a server fails. New Direct-to-Wire InfiniBand accelerates in-memory scale-out.
    • Oracle's M6-32 Big Memory Machine is the most powerful scale-up platform for Oracle Database In-Memory, providing up to 32 terabytes of DRAM and 3 terabytes per second of memory bandwidth for maximum in-memory performance.

Supporting Quotes

  • "We are delighted that our MicroStrategy Analytics Platform is among the first third-party applications to be certified with Oracle Database In-Memory," explained Paul Zolfaghari, President, MicroStrategy Incorporated. "Our participation in Oracle's beta program and integration with Oracle Database In-Memory builds on our long-standing relationship with Oracle, underscoring the importance of working together to optimize our platforms to extend the advanced functionality and speed performance improvements to our joint customers."
  • "Oracle is the only vendor in the industry to embrace in-memory computing from applications to middleware to database to systems, enabling businesses to maximize profitability by accelerating operations, quickly discovering new growth opportunities and making smarter, real-time decisions," said Andrew Mendelsohn, Executive Vice President, Database Server Technologies, Oracle. "Oracle Database 12c In-Memory uniquely delivers unprecedented performance for virtually all workloads with 100 percent application transparency and no data migration. Plus all the high availability, scalability, and security that customers have come to expect from the Oracle Database are fully preserved."
  • "Oracle Applications provide the foundation for our customers' mission-critical business operations, including sales, financials, supply chain and human resources. By raising the bar on speed, Oracle Database In-Memory enables customers to compound the value of their existing applications by deriving new insights and business opportunities faster," said Steve Miranda, Executive Vice President of Application Development, Oracle.

About Oracle
Oracle engineers hardware and software to work together in the cloud and in your data center. For more information about Oracle (NYSE: ORCL), visit www.oracle.com.

Trademarks
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor
The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation.

PDF Attachment Available: http://media.marketwire.com/attachments/201406/76191_DBIM_ComparChart_Vert.pdf

Image Available: http://www2.marketwire.com/mw/frame_mw?attachid=2613727

Contact Info

Letty Ledbetter
Oracle
+1.650.506.8071
Email Contact

Teri Whitaker
Oracle
+1.650.506.9914
Email Contact

