Big Data and Analytics By @MElRefaey | @BigDataExpo #BigData

Hadoop is a framework that simplifies the processing of data sets distributed across clusters of servers

This post is the first in a series exploring Big Data and analytics tools. I will walk through easy steps to start working with tools such as Apache Hadoop, Pig, and Mahout, use them to solve some large-scale analytics and learning problems, and shed light on some of the challenges we face while working with these tools.

1. Apache Hadoop
1.1 Overview

Hadoop is a framework that simplifies the processing of data sets distributed across clusters of servers. Two of the main components of Hadoop are HDFS and MapReduce. HDFS is the file system Hadoop uses to store all its data, and it spans all the nodes used by Hadoop; these nodes could be on a single server or spread across a large number of servers. In this section, we will go through the instructions for getting Hadoop up and running, with the configuration needed to make it useful for other components and frameworks that integrate with or depend on Hadoop (e.g., Hive, Pig, HBase, etc.).

Note: The installation will be a pseudo-distributed (single-node) deployment.

1.2 Tools and Versions
I've used the following tools and versions throughout this installation:

  • Ubuntu 14.04 LTS
  • Java 1.7.0_65 (java-7-openjdk-amd64)
  • Hadoop 2.5.1

1.3 Installation and Configuration

1. Install Java using the following commands:

apt-get update
apt-get install default-jdk
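You can verify the installation afterwards (the exact version string will vary with the package installed):

java -version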

2. Create Security Keys using the following commands:

ssh-keygen -t rsa -P ''
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
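To check that passwordless SSH to the local machine works (Hadoop's start-up scripts rely on it), you can try:

ssh localhost
exit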

3. Download Hadoop tar file using:

wget http://www.webhostingreviewjam.com/mirror/apache/hadoop/common/hadoop-2.5.1/hadoop-2.5.1.tar.gz

4. Extract the tar file using:

tar -xzvf hadoop-2.5.1.tar.gz

5. Move the extracted files to a location you can easily recognize, so you can later change the version used without many modifications:

mv hadoop-2.5.1/ /usr/local/hadoop

6. Configure the following environment variables in the bashrc file (to make sure they are set at every machine startup):

#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END

7. Source the bashrc file after the changes, so the system recognizes them, using the following command:

source ~/.bashrc

8. Edit hadoop-env.sh using vim:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

In this file, set JAVA_HOME explicitly; that makes its value always available to Hadoop whenever it starts.
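The relevant line, assuming the OpenJDK 7 path used above (it replaces the file's default export JAVA_HOME=${JAVA_HOME} entry):

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64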

9. Edit the core-site.xml file using vim as well:

vim /usr/local/hadoop/etc/hadoop/core-site.xml
The file will look like:
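A minimal pseudo-distributed core-site.xml sets fs.defaultFS (the newer name for fs.default.name), the URI clients use to reach HDFS; port 9000 is the value commonly used in single-node setups, so adjust as needed:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>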

10. Edit the YARN file yarn-site.xml as follows:

vim /usr/local/hadoop/etc/hadoop/yarn-site.xml
The file will look like:
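For a single-node setup, yarn-site.xml typically enables the MapReduce shuffle service; a sketch using the standard Hadoop 2.x properties:

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
</configuration>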

11. Create and edit the mapred-site.xml file:

vim /usr/local/hadoop/etc/hadoop/mapred-site.xml
The file will contain the following property, which specifies the framework that will be used for MapReduce:
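In the Hadoop 2.5.1 bundle this file ships only as a template, so you may need to copy it into place before editing; setting mapreduce.framework.name to yarn hands job execution over to YARN:

cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template /usr/local/hadoop/etc/hadoop/mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>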


12. Edit the hdfs-site.xml file in order to specify the directories that will be used as the datanode and namenode storage on that server.

vim /usr/local/hadoop/etc/hadoop/hdfs-site.xml
Create the two directories:

mkdir -p /usr/local/hadoop_store/hdfs/namenode
mkdir -p /usr/local/hadoop_store/hdfs/datanode

After editing, the file will contain the following properties:
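A sketch of the resulting file, with a replication factor of 1 (appropriate for a single node) and the two directories created above:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
  </property>
</configuration>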


13. Format the new Hadoop file system using the following command:

hdfs namenode -format
Note: This operation needs to be done once before we start using Hadoop. If it is executed again after Hadoop has been used, it'll destroy all the data on the Hadoop filesystem.

14. Now that all configuration is done, we can start using Hadoop. We should first run the following shell scripts:

start-dfs.sh
start-yarn.sh

To make sure everything is okay and the right processes are running, run the jps command:
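You should see output along these lines (the process IDs here are illustrative and will differ; the daemon names are what matter):

2653 NameNode
2786 DataNode
2973 SecondaryNameNode
3126 ResourceManager
3254 NodeManager
3587 Jps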



15. We can run the MapReduce examples that come in the Hadoop bundle, but first we need to do the following.

We should create the HDFS directories required to execute MapReduce jobs:
hdfs dfs -mkdir /user
hdfs dfs -mkdir /user/mohamed
and copy the input files to be processed into the distributed filesystem:
hdfs dfs -put {here is the path to the files to be copied} input
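From there, one of the bundled examples can be run end to end. A sketch using the grep example jar that ships with Hadoop 2.5.1 (the path assumes the install location above; the regex is the one from the standard Hadoop docs example and assumes the input files are the Hadoop config XMLs):

hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.1.jar grep input output 'dfs[a-z.]+'
hdfs dfs -cat output/*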

16. We can check the web consoles for the resource manager, the HDFS nodes, and running jobs.
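With the default Hadoop 2.x ports, and assuming the daemons run locally, these consoles are served at:

  • http://localhost:8088 - YARN ResourceManager (cluster state and running jobs)
  • http://localhost:50070 - HDFS NameNode (datanode status and filesystem browser)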
Issues and problems:

I've experienced some issues related to:

  • Formatting the HDFS: I resolved it by changing the permissions and ownership of the user who can format the namenode and datanode directories.
  • Connecting to the resource manager, with the following error:

ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 0 time(s); maxRetries=45
INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032

I resolved this by adding a few properties to yarn-site.xml:
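The properties below are a sketch of that fix, binding the ResourceManager addresses explicitly using YARN's default ports (adjust the hostname to your machine; they go inside the <configuration> element):

<property>
  <name>yarn.resourcemanager.address</name>
  <value>localhost:8032</value>
</property>
<property>
  <name>yarn.resourcemanager.scheduler.address</name>
  <value>localhost:8030</value>
</property>
<property>
  <name>yarn.resourcemanager.resource-tracker.address</name>
  <value>localhost:8031</value>
</property>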



We've reached the end of our first post on big data and analytics. I hope you enjoyed reading it and experimenting with the Hadoop installation and configuration. The next post will be about Apache Pig.


More Stories By Mohamed El-Refaey

Mohamed El-Refaey is head of research and development at EDC (Egypt Development Center), a member of NTG. He previously worked for Qlayer (acquired by Sun Microsystems), where his passion for the cloud computing domain started. He has more than 10 years of experience in software design and development in e-commerce, BPM, EAI, Web 2.0, banking applications, financial markets, Java and J2EE, HIPAA, SOX, BPEL, and SOA, with the last two years focused on virtualization technology and cloud computing through studies, technical papers, research, and participation in international groups and events. He has been awarded in recognition of innovation and thought leadership while working as an IT Specialist at EDS (an HP company). He is also a member of the Cloud Computing Interoperability Forum (CCIF) and of the UCI (Unified Cloud Interface) open source project, to which he contributed the project architecture.
