By Marketwired
June 3, 2014 01:30 PM EDT
PALO ALTO, CA -- (Marketwired) -- 06/03/14 -- Continuuity, creator of the industry's first Big Data Application Server for Apache Hadoop, and AT&T Labs today announced plans to release into open source a disruptive new technology that will provide an integrated, high-quality, and consistent streaming analytics capability. Initially code-named jetStream, it will be made available to the market via open source in the third quarter of 2014.
jetStream will offer a simple, efficient, and cost-effective way for businesses, OEMs/ISVs, system integrators, service providers, and developers to create a diverse range of big data analytics and streaming applications that address a broad set of business use cases, such as network intrusion detection and analytics, real-time analysis for spam filtering, social media market analysis, location analytics, and real-time recommendation engines that match relevant content to the right users at the right time.
"Continuuity's mission is to help enterprises to make their businesses truly data-driven. Given the wealth of data being consumed and processed, the ability to make informed, real-time decisions with data is critical," said Jonathan Gray, Founder & CEO of Continuuity. "Our collaboration with AT&T to open source jetStream is consistent with our vision of becoming the de facto platform for easily building applications powered by data and operated at scale."
To create jetStream, Continuuity joined forces with AT&T Labs to integrate Continuuity BigFlow, a distributed framework for building durable high throughput data processing applications, with AT&T's streaming analytics tool -- an extremely fast, low-latency streaming analytics database originally built out of the necessity for managing its network at scale.
jetStream brings together the complementary functionality of BigFlow and AT&T's streaming analytics tool to create a unified real-time framework that supports in-memory stream processing and model-based event processing, with direct integration for a variety of data systems including Apache HBase and HDFS. By combining AT&T's low-latency and declarative language support with BigFlow's durable, high-throughput computing capabilities and procedural language support, jetStream provides developers with a disruptive new way to take in and store vast quantities of data, build massively scalable applications, and update applications in real time as new data is ingested. Specifically, jetStream will provide:
- Direct integration of real-time data feeds and processing applications with Hadoop and HBase, with YARN used for deployment and resource management
- Framework-level correctness, fault-tolerance guarantees, and application logic scalability that reduce friction, errors, and bugs during development
- A transaction engine that provides delivery, isolation, and consistency guarantees that enable exactly-once processing semantics
- Scalability without increasing the operational cost of building and maintaining applications
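To make the exactly-once guarantee above concrete, the sketch below shows the general idea in plain Java: each event carries a unique ID, and the processor records the ID together with its state update, so a redelivered event after a failure is detected and skipped rather than applied twice. This is an illustrative simplification under assumed semantics, not the jetStream API; class and method names here are hypothetical.

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch of exactly-once processing semantics (hypothetical,
// not the actual jetStream API). The event ID and the state update are
// recorded together, so a duplicate delivery has no effect.
public class ExactlyOnceCounter {
    private final Set<String> processedIds = new HashSet<>();
    private long count = 0;

    // Returns true if the event was applied, false if it was a duplicate.
    public synchronized boolean process(String eventId) {
        if (processedIds.contains(eventId)) {
            return false;              // redelivery after a failure: skip
        }
        count++;                       // state update...
        processedIds.add(eventId);     // ...recorded atomically with the ID
        return true;
    }

    public synchronized long getCount() {
        return count;
    }

    public static void main(String[] args) {
        ExactlyOnceCounter counter = new ExactlyOnceCounter();
        counter.process("evt-1");
        counter.process("evt-2");
        counter.process("evt-1");      // duplicate delivery, ignored
        System.out.println(counter.getCount()); // prints 2
    }
}
```

In a distributed system, a real transaction engine would persist the processed-ID set and the state in the same transactional store (e.g., HBase) so the two survive failures together; the in-memory set here only illustrates the deduplication logic.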
"From financial services to global network management, there are industries with a need to conduct real-time business; this requires access to a real-time streaming analytics technology that also provides consistency, speed and scalability," said Christopher W. Rice, Vice President of Advanced Technologies at AT&T Labs. "We believe that making this technology available in the market will advance the industry, broaden the supplier base, and lower the cost of such technology. Putting this into open source makes the tools required for developing data-intensive, complex applications accessible to a broader base of developers across businesses and partners of all sizes."
AT&T Inc. (NYSE:T) is a premier communications holding company and one of the most honored companies in the world. Its subsidiaries and affiliates -- AT&T operating companies -- are the providers of AT&T services in the United States and internationally. With a powerful array of network resources that includes the nation's most reliable 4G LTE network, AT&T is a leading provider of wireless, Wi-Fi, high speed Internet, voice and cloud-based services. A leader in mobile Internet, AT&T also offers the best wireless coverage worldwide of any U.S. carrier, offering the most wireless phones that work in the most countries. It also offers advanced TV service with the AT&T U-verse® brand. The company's suite of IP-based business communications services is one of the most advanced in the world.
Additional information about AT&T Inc. and the products and services provided by AT&T subsidiaries and affiliates is available at http://about.att.com or follow our news on Twitter at @ATT, on Facebook at http://www.facebook.com/att and YouTube at http://www.youtube.com/att.
© 2014 AT&T Intellectual Property. All rights reserved. AT&T, the AT&T logo and all other marks contained herein are trademarks of AT&T Intellectual Property and/or AT&T affiliated companies. All other marks contained herein are the property of their respective owners.
Reliability claim based on data transfer completion rates on nationwide 4G LTE networks. 4G LTE availability varies.
Continuuity makes it easy for any Java developer to build and manage data applications in the cloud or on-premise. Continuuity Reactor, its flagship product, is the industry's first Big Data Application Server for Apache Hadoop. Based in Palo Alto, Calif., the company is backed by leading investors including Battery Ventures, Andreessen Horowitz and Ignition Partners. Learn more at www.continuuity.com.