Apache Spark: A Key to Big Data Initiatives | @CloudExpo #Microservices

As with other data processing technologies, Spark is not suitable for all types of workloads

Apache Spark continues to gain traction as companies launch or expand their big data initiatives, and it is clearly finding a place in corporate IT strategies.

The open-source cluster computing framework was developed in the AMPLab at the University of California at Berkeley in 2009 and became an incubated project of the Apache Software Foundation in 2013. By early 2014, Spark had become one of the foundation’s top-level projects, and today it is one of the most active projects managed by Apache.

Because Spark is optimized to run in memory, it can process data much faster than disk-based approaches such as MapReduce and, as a result, can deliver much higher performance for certain types of applications. By enabling programs to load data into a cluster's memory and query it repeatedly, the framework is especially well suited to iterative machine learning algorithms.
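To make the in-memory point concrete, here is a minimal sketch in Scala, assuming a Spark 2.x or later session and an illustrative CSV file in HDFS; the path and column names are placeholders for illustration, not details from any deployment described in this article.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("cache-example").getOrCreate()

// Read the data once, then pin it in cluster memory.
val events = spark.read
  .option("header", "true")
  .csv("hdfs:///data/events.csv")
  .cache()

// Repeated queries hit the cached copy instead of re-reading from disk,
// which is the access pattern iterative algorithms depend on.
val total   = events.count()
val perUser = events.groupBy("user_id").count()
perUser.show(10)
```

The first action materializes the cache; subsequent queries read from memory, which is where much of the speedup over MapReduce-style jobs comes from.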

As with other data processing technologies, Spark is not suitable for all types of workloads. But companies launching big data efforts can leverage the framework for a variety of projects, such as interactive queries across large data sets; the processing of streaming data from sensors, as with Internet of Things (IoT) applications; and machine learning tasks.
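For the streaming case, a hedged sketch using Structured Streaming is shown below. The socket source, host, port, and field layout are assumptions chosen for brevity (a production IoT pipeline would more likely read from Kafka), not details from the article.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.avg

val spark = SparkSession.builder().appName("iot-stream").getOrCreate()
import spark.implicits._

// Each incoming line is assumed to look like "deviceId,temperature".
val readings = spark.readStream
  .format("socket")
  .option("host", "localhost")
  .option("port", 9999)
  .load()
  .as[String]
  .map(_.split(","))
  .map(parts => (parts(0), parts(1).toDouble))
  .toDF("device_id", "temperature")

// Maintain a running average temperature per device as new data arrives.
val avgTemp = readings
  .groupBy($"device_id")
  .agg(avg($"temperature").as("avg_temp"))

avgTemp.writeStream
  .outputMode("complete")
  .format("console")
  .start()
  .awaitTermination()
```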

In addition, developers can use Spark to support other processing tasks, taking advantage of the open-source framework's large set of developer libraries and application programming interfaces (APIs) and its comprehensive support for popular languages such as Java, Python, R and Scala.

Apache Spark has three key things going for it that IT organizations should keep in mind:

  1. The framework’s relative simplicity. The APIs are designed for interacting easily and rapidly with data at scale, and are structured in a way that lets application developers start using Spark right away.
  2. The framework is designed for speed, operating both in memory and on disk. Spark’s performance can be even greater when it supports interactive queries of data stored in memory.
  3. Spark supports multiple programming languages, as mentioned above, and it includes native support for tight integration with leading storage solutions in the Hadoop ecosystem and beyond (see the sketch after this list).
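Here is a short sketch of points 1 and 3, assuming Spark 2.x or later and an illustrative Parquet dataset in HDFS (the path, table name and columns are hypothetical): a few lines read from Hadoop-ecosystem storage and expose the result to interactive SQL.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("interactive-query").getOrCreate()

// Parquet files in HDFS become a queryable table in two lines.
val orders = spark.read.parquet("hdfs:///warehouse/orders")
orders.createOrReplaceTempView("orders")

// Analysts can then iterate on ad hoc SQL against the same view.
spark.sql(
  """SELECT region, SUM(amount) AS revenue
    |FROM orders
    |GROUP BY region
    |ORDER BY revenue DESC""".stripMargin
).show()
```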

Spark is proving to be well suited for a number of business use cases and is helping companies to transform their big data initiatives and deliver analytics much faster and with greater efficiency.

One company, a provider of cloud-based predictive analytics software specifically designed for the telecommunications industry, is using the full Spark stack as part of its Hadoop-based architecture on MapR. This has helped the company achieve horizontal scalability on commodity hardware and reduce storage and computing costs.

The new technology stack allows the software company to innovate continuously and deliver value to its telecommunications customers by offering predictive insights from the cloud. Telecommunications data now arrives in higher volumes, at higher frequency and in more complex structures, particularly as new types of devices generate IoT data and mobile phones run a fast-growing number of apps. The company needs to turn this data into predictive insights through data science and analytics, and Spark helps make this possible.

Another business benefiting from Spark is a global pharmaceuticals manufacturer that relies on big data solutions for drug discovery processes. One of the company’s areas of drug research requires lots of interaction with diverse data from external organizations.

A combined Spark and Hadoop workflow, with integration layers, enables the company’s researchers to leverage thousands of experiments other organizations have conducted, giving the pharmaceuticals company a significant competitive advantage. The big data solutions the company uses allow it to integrate and analyze data so that it can speed up drug research.

These technologies are now being used for a variety of projects across the enterprise, including video analysis, proteomics and metagenomics. Researchers can access data directly through a Spark API on a number of databases with schemas designed for their specific analytics needs.
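As one hedged illustration of that access pattern, Spark's generic JDBC data source can pull a table from an external relational database into a DataFrame for analysis; the connection URL, table name and credentials below are hypothetical placeholders, not details of the company's actual systems.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("research-db").getOrCreate()

// Load an external database table through Spark's JDBC data source
// (the matching JDBC driver must be on the classpath).
val experiments = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://db-host:5432/research")
  .option("dbtable", "public.experiments")
  .option("user", "analyst")
  .option("password", "secret")
  .load()

// The external data can then be filtered, joined and aggregated
// like any other DataFrame.
experiments
  .filter("assay_type = 'proteomics'")
  .groupBy("compound_id")
  .count()
  .show()
```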

And a third business use case for Spark comes from a service provider that delivers analytics services to various industries. The company deployed the Spark framework in conjunction with its Hadoop big data initiative, and is able to dramatically cut query times and improve the accuracy of analytics results. That has enabled the company to provide enhanced services to its customers.

Clearly, Apache Spark can provide a number of benefits to organizations looking to get the most value out of their information resources and the biggest returns on their big data investments. The framework provides the speed and efficiency improvements companies need to deliver on the promise of big data and analytics.

To further explore the advantages of Spark, see the free interactive eBook, Getting Started with Apache Spark: From Inception to Production, by James A. Scott.

More Stories By Jim Scott

Jim has held positions running Operations, Engineering, Architecture and QA teams in the Consumer Packaged Goods, Digital Advertising, Digital Mapping, Chemical and Pharmaceutical industries. Jim has built systems that handle more than 50 billion transactions per day and his work with high-throughput computing at Dow Chemical was a precursor to more standardized big data concepts like Hadoop.
