Continuuity, AT&T Labs to Open Source Real-Time Data Processing Framework

Disruptive New Technology Will Enable Integrated, High-Quality and Consistent Streaming Analytics

PALO ALTO, CA -- (Marketwired) -- 06/03/14 -- Continuuity, creator of the industry's first Big Data Application Server for Apache Hadoop™, and AT&T Labs today announced plans to release into open source a disruptive new technology that will provide an integrated, high-quality, and consistent streaming analytics capability. Initially code-named jetStream, it will be made available to the market via open source in the third quarter of 2014.

jetStream will offer a simple, efficient, and cost-effective way for businesses, OEMs/ISVs, system integrators, service providers, and developers to create a diverse range of big data analytics and streaming applications that address a broad set of business use cases, such as network intrusion detection and analytics, real-time analysis for spam filtering, social media market analysis, location analytics, and real-time recommendation engines that match relevant content to the right users at the right time.

"Continuuity's mission is to help enterprises to make their businesses truly data-driven. Given the wealth of data being consumed and processed, the ability to make informed, real-time decisions with data is critical," said Jonathan Gray, Founder & CEO of Continuuity. "Our collaboration with AT&T to open source jetStream is consistent with our vision of becoming the de facto platform for easily building applications powered by data and operated at scale."

To create jetStream, Continuuity joined forces with AT&T Labs to integrate Continuuity BigFlow, a distributed framework for building durable, high-throughput data processing applications, with AT&T's streaming analytics tool -- an extremely fast, low-latency streaming analytics database originally built out of the need to manage AT&T's network at scale.

jetStream brings together the complementary functionality of BigFlow and AT&T's streaming analytics tool to create a unified real-time framework that supports in-memory stream processing and model-based event processing, with direct integration for a variety of data systems including Apache HBase and HDFS. By combining AT&T's low-latency processing and declarative language support with BigFlow's durable, high-throughput computing capabilities and procedural language support, jetStream provides developers with a disruptive new way to take in and store vast quantities of data, build massively scalable applications, and update applications in real time as new data is ingested. Specifically, developers will be able to do the following:

  • Directly integrate real-time data feeds and processing applications with Hadoop and HBase, and use YARN for deployment and resource management
  • Rely on framework-level correctness, fault-tolerance guarantees, and application-logic scalability that reduce friction, errors, and bugs during development
  • Use a transaction engine that provides delivery, isolation, and consistency guarantees, enabling exactly-once processing semantics
  • Scale applications without increasing the operational cost of building and maintaining them
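The exactly-once guarantee described above can be illustrated with a small sketch. jetStream's API was not public at the time of this announcement, so the class and method names below are hypothetical; the sketch only demonstrates the underlying idea: commit the processed offset and the derived state atomically, so that an event redelivered after a failure cannot be counted twice.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration of exactly-once processing semantics.
// None of these names come from the jetStream API; they exist only
// to show how a transactional commit of (offset, state) prevents
// duplicate delivery from corrupting results.
public class ExactlyOnceDemo {

    // Stand-in for a durable store: the last committed offset and the
    // running aggregate are always written together, as one transaction.
    static final Map<String, Long> store = new HashMap<>();

    // Apply one event inside a single atomic commit. Offsets at or below
    // the committed watermark were already processed, so they are skipped.
    static void apply(long offset, long value) {
        long committed = store.getOrDefault("offset", -1L);
        if (offset <= committed) {
            return; // duplicate delivery: already part of a committed transaction
        }
        long total = store.getOrDefault("total", 0L);
        // In a real engine these two writes would commit atomically,
        // which is what the isolation and consistency guarantees provide.
        store.put("total", total + value);
        store.put("offset", offset);
    }

    public static long runDemo() {
        // Offset 1 is delivered twice, simulating a retry after a failure.
        long[][] events = {{0, 5}, {1, 7}, {1, 7}, {2, 3}};
        for (long[] e : events) {
            apply(e[0], e[1]);
        }
        return store.get("total"); // 15, not 22: the redelivery was ignored
    }

    public static void main(String[] args) {
        System.out.println("total = " + runDemo());
    }
}
```

Without the atomic commit of offset and state, a crash between updating the total and recording the offset would force a replay that double-counts the event; committing both together is what turns at-least-once delivery into exactly-once processing.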

"From financial services to global network management, there are industries with a need to conduct real-time business; this requires access to a real-time streaming analytics technology that also provides consistency, speed and scalability," said Christopher W. Rice, Vice President of Advanced Technologies at AT&T Labs. "We believe that making this technology available in the market will advance the industry, broaden the supplier base, and lower the cost of such technology. Putting it into open source makes the tools required for developing data-intensive, complex applications accessible to a broader base of developers across businesses and partners of all sizes."

For more information, please visit jetStream.io or www.research.att.com/projects.

About AT&T
AT&T Inc. (NYSE:T) is a premier communications holding company and one of the most honored companies in the world. Its subsidiaries and affiliates -- AT&T operating companies -- are the providers of AT&T services in the United States and internationally. With a powerful array of network resources that includes the nation's most reliable 4G LTE network, AT&T is a leading provider of wireless, Wi-Fi, high-speed Internet, voice and cloud-based services. A leader in mobile Internet, AT&T also offers the best wireless coverage worldwide of any U.S. carrier, offering the most wireless phones that work in the most countries. It also offers advanced TV service with the AT&T U-verse® brand. The company's suite of IP-based business communications services is one of the most advanced in the world.

Additional information about AT&T Inc. and the products and services provided by AT&T subsidiaries and affiliates is available at http://about.att.com or follow our news on Twitter at @ATT, on Facebook at http://www.facebook.com/att and YouTube at http://www.youtube.com/att.

© 2014 AT&T Intellectual Property. All rights reserved. AT&T, the AT&T logo and all other marks contained herein are trademarks of AT&T Intellectual Property and/or AT&T affiliated companies. All other marks contained herein are the property of their respective owners.

Reliability claim based on data transfer completion rates on nationwide 4G LTE networks. 4G LTE availability varies.

About Continuuity
Continuuity makes it easy for any Java developer to build and manage data applications in the cloud or on-premise. Continuuity Reactor, its flagship product, is the industry's first Big Data Application Server for Apache Hadoop™. Based in Palo Alto, Calif., the company is backed by leading investors including Battery Ventures, Andreessen Horowitz and Ignition Partners. Learn more at www.continuuity.com.

