Continuuity, AT&T Labs to Open Source Real-Time Data Processing Framework

Disruptive New Technology Will Enable Integrated, High-Quality and Consistent Streaming Analytics

PALO ALTO, CA -- (Marketwired) -- 06/03/14 -- Continuuity, creator of the industry's first Big Data Application Server for Apache Hadoop™, and AT&T Labs today announced plans to release into open source a disruptive new technology that will provide an integrated, high-quality, and consistent streaming analytics capability. Initially code-named jetStream, it will be made available to the market via open source in the third quarter of 2014.

jetStream will offer a simple, efficient, and cost-effective way for businesses, OEMs/ISVs, system integrators, service providers, and developers to create a diverse range of big data analytics and streaming applications that address a broad set of business use cases, such as network intrusion detection and analytics, real-time analysis for spam filtering, social media market analysis, location analytics, and real-time recommendation engines that match relevant content to the right users at the right time.

"Continuuity's mission is to help enterprises to make their businesses truly data-driven. Given the wealth of data being consumed and processed, the ability to make informed, real-time decisions with data is critical," said Jonathan Gray, Founder & CEO of Continuuity. "Our collaboration with AT&T to open source jetStream is consistent with our vision of becoming the de facto platform for easily building applications powered by data and operated at scale."

To create jetStream, Continuuity joined forces with AT&T Labs to integrate Continuuity BigFlow, a distributed framework for building durable, high-throughput data processing applications, with AT&T's streaming analytics tool -- an extremely fast, low-latency streaming analytics database originally built out of the necessity of managing AT&T's network at scale.

jetStream brings together the complementary functionality of BigFlow and AT&T's streaming analytics tool to create a unified real-time framework that supports in-memory stream processing and model-based event processing, with direct integration for a variety of data systems including Apache HBase and HDFS. By combining AT&T's low-latency and declarative language support with BigFlow's durable, high-throughput computing capabilities and procedural language support, jetStream gives developers a disruptive new way to ingest and store vast quantities of data, build massively scalable applications, and update applications in real time as new data arrives. Specifically, jetStream will provide the following:

  • Direct integration of real-time data feeds and processing applications with Hadoop and HBase, with YARN used for deployment and resource management
  • Framework-level correctness, fault-tolerance guarantees, and application-logic scalability that reduce friction, errors, and bugs during development
  • A transaction engine that provides delivery, isolation, and consistency guarantees, enabling exactly-once processing semantics (a brief illustrative sketch follows this list)
  • Scalability without increasing the operational cost of building and maintaining applications
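
The announcement does not include API details, but the exactly-once guarantee described above can be illustrated in plain Java. The sketch below is a hypothetical, self-contained model, not the jetStream API (which had not yet been published at the time of this announcement): it achieves exactly-once effects on top of at-least-once delivery by recording the IDs of applied events, so a replayed event -- for example, one redelivered after a failure and recovery -- changes state only once. All class and method names are illustrative assumptions.

    // Hypothetical sketch: exactly-once effects via idempotent, deduplicated
    // state updates. Not the jetStream API -- all names are illustrative.
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    public class ExactlyOnceCounter {
        // Stand-in for durable application state; a real framework would
        // keep this in HBase/HDFS and commit updates transactionally.
        private final Map<String, Long> counts = new HashMap<>();
        // Ledger of applied event IDs: replayed duplicates are detected and skipped.
        private final Set<String> applied = new HashSet<>();

        // Apply one event atomically; returns false if it was a duplicate.
        public synchronized boolean process(String eventId, String key) {
            if (!applied.add(eventId)) {
                return false; // already applied: at-least-once delivery, exactly-once effect
            }
            counts.merge(key, 1L, Long::sum);
            return true;
        }

        public synchronized long count(String key) {
            return counts.getOrDefault(key, 0L);
        }

        public static void main(String[] args) {
            ExactlyOnceCounter counter = new ExactlyOnceCounter();
            counter.process("evt-1", "spam");
            counter.process("evt-2", "spam");
            counter.process("evt-1", "spam"); // replayed duplicate is ignored
            System.out.println(counter.count("spam")); // prints 2
        }
    }

In a production streaming framework, the deduplication ledger and the application state would be committed together in a single transaction so that neither can advance without the other; that atomic commit is the role the release ascribes to jetStream's transaction engine.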

"From financial services to global network management, there are industries with a need to conduct real-time business; this requires access to a real-time streaming analytics technology that also provides consistency, speed and scalability," said Christopher W. Rice, Vice President of Advanced Technologies at AT&T Labs. "We believe that making this technology available in the market will advance the industry, broaden the supplier base, and lower the cost of such technology. Putting this into open-source makes available the required tools for developing data-intensive, complex applications accessible to a broader base of developers across businesses and partners of all sizes."

For more information, please visit jetStream.io or www.research.att.com/projects.

About AT&T
AT&T Inc. (NYSE:T) is a premier communications holding company and one of the most honored companies in the world. Its subsidiaries and affiliates -- AT&T operating companies -- are the providers of AT&T services in the United States and internationally. With a powerful array of network resources that includes the nation's most reliable 4G LTE network, AT&T is a leading provider of wireless, Wi-Fi, high speed Internet, voice and cloud-based services. A leader in mobile Internet, AT&T also offers the best wireless coverage worldwide of any U.S. carrier, offering the most wireless phones that work in the most countries. It also offers advanced TV service with the AT&T U-verse® brand. The company's suite of IP-based business communications services is one of the most advanced in the world.

Additional information about AT&T Inc. and the products and services provided by AT&T subsidiaries and affiliates is available at http://about.att.com or follow our news on Twitter at @ATT, on Facebook at http://www.facebook.com/att and YouTube at http://www.youtube.com/att.

© 2014 AT&T Intellectual Property. All rights reserved. AT&T, the AT&T logo and all other marks contained herein are trademarks of AT&T Intellectual Property and/or AT&T affiliated companies. All other marks contained herein are the property of their respective owners.

Reliability claim based on data transfer completion rates on nationwide 4G LTE networks. 4G LTE availability varies.

About Continuuity
Continuuity makes it easy for any Java developer to build and manage data applications in the cloud or on-premise. Continuuity Reactor, its flagship product, is the industry's first Big Data Application Server for Apache Hadoop™. Based in Palo Alto, Calif., the company is backed by leading investors including Battery Ventures, Andreessen Horowitz and Ignition Partners. Learn more at www.continuuity.com.

