Putting Analytics into the Decision-Making Workflow with Apache Spark

Big Data shouldn't be restricted to data scientists

Data-driven businesses use analytics to inform and support their decisions. In many companies, the marketing, sales, finance, and operations departments are the earliest adopters of data analytics, with the rest of the business lagging behind. The goal for many organizations now is to make analytics a natural part of most, if not every, employee's daily workflow. Achieving that objective typically requires a shift in corporate culture and ready access to user-friendly data analytics tools.

Big Data Shouldn't Be Restricted to Data Scientists
Big Data experts, when discussing how to integrate data analysis into the workflow across an enterprise, often talk blithely about how users can simply leverage their SQL skills to query data. The problem is that not everyone has SQL skills, or even knows what SQL is.

Companies that plan to transform themselves into data-driven, lean businesses should keep in mind that not every employee needs to be a data scientist. Focus the majority of training efforts (including how to run basic SQL queries, if necessary) on the employees whose jobs involve fact-based decision-making.

Making employees wait for IT to manage schemas and set up ETL tasks is counterproductive. In a busy company, by the time data is prepped for analysis, it may have lost some of its actionable relevance. Instead, provide robust self-service data analysis tools, such as Apache Drill, that let users extract the most value possible from data stored in Hadoop. This frees employees to work with data in its native formats (schema-less data, nested data, and data with rapidly evolving schemas) with little to no IT involvement.
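As a rough illustration of what that kind of self-service querying can look like, the sketch below sends a SQL query to Apache Drill's REST endpoint (assumed to be a local instance on Drill's default port, 8047) to query a raw JSON file directly, with no schema definition or ETL step. The file path and field names are hypothetical and are only there to show the shape of the workflow.

```python
# Minimal sketch: query a raw JSON file through Apache Drill's REST API.
# Assumes a local Drill instance on the default port (8047); the file path
# and field names below are hypothetical.
import requests

DRILL_URL = "http://localhost:8047/query.json"

# Drill infers the schema at query time, so nested JSON fields can be
# addressed directly in SQL without any upfront modeling.
sql = """
    SELECT t.customer.region AS region,
           COUNT(*)           AS orders
    FROM dfs.`/data/raw/orders.json` t
    GROUP BY t.customer.region
"""

response = requests.post(
    DRILL_URL,
    json={"queryType": "SQL", "query": sql},
    timeout=60,
)
response.raise_for_status()

for row in response.json().get("rows", []):
    print(row)
```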

Self-service data tools also enable exploratory queries. Users can explore the data directly and extend their analysis effortlessly, with no need to wait for IT to prep additional data sets. Analysis can then extend past known, structured data to semi-structured and unstructured data, such as call center logs, videos, spreadsheets, social media data, clickstream data, web log files, and external data (such as publicly available industry data), allowing a business to gain big-picture, actionable insights on the fly.

Apache Spark: Bringing New Efficiencies to Big Data Analysis
Agile companies that rely on data analysis performed in near real time and real time also need solutions that can rapidly process large data sets. Apache Spark, an in-memory data processing framework, is increasingly the solution of choice.

Spark is a framework for parallel, distributed data processing. It can be deployed on Apache Hadoop via YARN, on Apache Mesos, or with its own standalone cluster manager. It can serve as a foundation for other data processing frameworks, and it supports programming in Scala, Java, and Python. Data can be accessed in HDFS, Cassandra, HBase, Hive, Tachyon, and any Hadoop data source.
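For readers who want a feel for what Spark code looks like, here is a minimal PySpark sketch: a distributed word count over a file assumed to live in HDFS. The path is hypothetical, and the cluster details (master URL, Hadoop configuration) are left to the deployment.

```python
# Minimal PySpark sketch: distributed word count over a file in HDFS.
# The HDFS path is hypothetical; the master URL comes from spark-submit
# or the cluster manager (YARN, Mesos, or Spark standalone).
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("word-count-sketch")
sc = SparkContext(conf=conf)

lines = sc.textFile("hdfs:///data/sample/events.log")

counts = (
    lines.flatMap(lambda line: line.split())   # split lines into words
         .map(lambda word: (word, 1))          # pair each word with a count
         .reduceByKey(lambda a, b: a + b)      # sum counts per word in parallel
)

for word, count in counts.take(10):
    print(word, count)

sc.stop()
```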

With Spark, data sets can be pinned in memory, which noticeably boosts application performance. Spark also speeds up applications running on disk, and it handles interactive queries and stream processing far more efficiently than MapReduce.
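As a brief, hedged illustration of pinning a data set in memory: the persistence call below is part of the standard RDD API, while the workload and HDFS path are hypothetical. The idea is that the first action materializes the cached copy, and later interactive-style queries reuse it instead of re-reading from disk.

```python
# Sketch of in-memory caching with the standard RDD API.
# Assumes `sc` is an existing SparkContext; the HDFS path is hypothetical.
from pyspark import StorageLevel

events = sc.textFile("hdfs:///data/sample/events.log")

# Pin the data set in memory (spilling to disk if it does not fit),
# so subsequent actions reuse the cached partitions.
events.persist(StorageLevel.MEMORY_AND_DISK)

# First action materializes and caches the data...
total = events.count()

# ...later, interactive-style queries run against the in-memory copy.
errors = events.filter(lambda line: "ERROR" in line).count()
print(total, errors)
```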

Spark also eliminates the need for separate, distributed systems to process, for example, batch applications, interactive queries, iterative algorithms, and streaming workloads. With Spark, all of these processing types are supported by the same engine, which reduces management chores and makes them easier to combine.
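To make the "one engine, several workload types" point concrete, here is a hedged sketch that runs a batch computation and a streaming computation on the same SparkContext. The HDFS path and the socket source (host and port) are placeholders; a production job would typically read from a real source such as Kafka.

```python
# Sketch: batch and streaming workloads sharing one Spark engine.
# The HDFS path and the socket source (host/port) are placeholders.
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(conf=SparkConf().setAppName("unified-engine-sketch"))

# Batch: count historical records already sitting in HDFS.
historical = sc.textFile("hdfs:///data/sample/events.log")
print("historical events:", historical.count())

# Streaming: the same engine processes live records in micro-batches.
ssc = StreamingContext(sc, batchDuration=5)
live = ssc.socketTextStream("localhost", 9999)
live.count().pprint()   # print the size of each 5-second micro-batch

ssc.start()
ssc.awaitTermination()
```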

Businesses can count on Spark's benefits over the long term. Spark, initially conceived as a project at UC Berkeley, moved to the Apache Software Foundation in 2013 and became a top-level project in 2014. Top-level status, which Spark shares with projects such as Hadoop and httpd, indicates that a project has strong community backing from developers and users, and that it has proved its worth. More than 50 companies currently list themselves on Spark's "Powered By" page.

Putting Data-Driven Intelligence to Work
Big Data encompasses multiple processes (collection, cleansing, integration, management, governance, security, analysis, and decision-making), all of which need to be in place before a company can consider itself data-driven. Oddly, the decision-making process itself tends to get the least attention.

Gaining real ROI from a Big Data project requires more than fast tools and a solid plan for incorporating analysis-driven decision-making into users' workflows. Quickly discovering exciting new insights in data has no benefit if the company lacks a process for responding to that new intelligence just as quickly and effectively. When devising (or revising) your Big Data project, ensure that you build in an implementation process that turns analysis into action.

And finally, a word of warning about real-time analysis: It's easy to lose sight of long-range goals when you're immersed in the moment. Ensure that business goals are aligned with data analysis activities, and establish KPIs to monitor the success of data-driven initiatives. Big Data should provide a company with a sustainable competitive edge.

To explore more of what Spark has to offer, jump over to Getting Started with Apache Spark: From Inception to Production, a free interactive ebook by James A. Scott.

More Stories By Jim Scott

Jim has held positions running Operations, Engineering, Architecture, and QA teams in the Consumer Packaged Goods, Digital Advertising, Digital Mapping, Chemical, and Pharmaceutical industries. He has built systems that handle more than 50 billion transactions per day, and his work with high-throughput computing at Dow Chemical was a precursor to more standardized big data concepts like Hadoop.
