EMC's Greenplum Unit Helps with Large Publicly Accessible Hadoop Cluster

Need to test your Hadoop app on a thousand nodes? Here’s how.

It isn’t often that you can get access to a thousand-node network to test your latest app, but thanks to the efforts of EMC’s Greenplum unit and some additional computing vendors, you can, and more amazingly, it is free of charge too.

The network was announced last fall at Strata and connects 1,000 specialized Supermicro servers, each running dual Intel Xeon processors with 48 GB of RAM, along with Mellanox 10 Gb Ethernet adapters and switches and a total of 12,000 Seagate 2 TB drives. It is all contained within Greenplum’s Las Vegas data center, with the goal of being the largest publicly accessible Hadoop cluster around. While Yahoo, eBay, and others have some fairly large Hadoop clusters, they generally don’t let anyone else come in and try out their apps. The cluster goes by the name Analytics Workbench; on its page, you can click the “learn more” button and submit your name if you are interested in using it.
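Those figures add up quickly. A back-of-the-envelope calculation (a sketch in Python using the numbers quoted above, not official Greenplum capacity figures) gives a sense of the cluster's aggregate resources:

```python
# Rough aggregate capacity of the Analytics Workbench, computed from the
# per-node figures quoted in the article (illustrative, not official).
nodes = 1000
ram_per_node_gb = 48
drives = 12000
drive_capacity_tb = 2

total_ram_tb = nodes * ram_per_node_gb / 1024        # aggregate RAM in TB
total_storage_pb = drives * drive_capacity_tb / 1024  # raw storage in PB

print(f"Total RAM: {total_ram_tb:.1f} TB")        # ~46.9 TB
print(f"Raw storage: {total_storage_pb:.1f} PB")  # ~23.4 PB
```

In other words, roughly 47 TB of memory and over 20 PB of raw disk, before any replication overhead that Hadoop's file system would impose.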

The goal, according to Greenplum staffers, is to have a community and collaborative big data platform that can be applied to a set of analytical problems with wide appeal. When the Strata announcement was made last fall, Greenplum stated that they wanted to eventually publish any results from the cluster, but they haven’t yet. Intel was one of the first clients to use the workbench (running a thousand-node job, too), but they are still reviewing their results.

Other clients running tests on the cluster include Mellanox and VMware, who both donated gear to power it, and a research team from the University of Central Florida. A group from NASA Goddard is using it to perform an analysis of historical weather patterns. The cluster formally opened up in July, and yes, it really is free of charge. Applicants need to be vetted and work closely with the Greenplum engineers to get their apps uploaded and configured for the cluster.

“We accept bids based on any submitted application and developers can request specific time and resources,” says William Davis, one of the Greenplum product marketers involved with the cluster’s creation. Applications are reviewed by an internal group of Hadoop experts called the Jedi Council, which selects the applicants that are the best fit for the next test run on the cluster.

Greenplum intends to use the cluster in a variety of ways besides public testing. Sometime next quarter they will launch a Hadoop training program. A unique aspect of the program is that each course participant will be granted access to the cluster as a sandbox environment for their own project; they are still working out the details of how this will work. The company has other fee-based programs to leverage its experience with this cluster, including what it calls its Analytics Lab packages, which apply its team of data scientists to specific vertical markets or particular custom applications.

Several other tools are offered on the cluster in addition to Hadoop, including MapReduce, the parallel job-processing framework; VMware’s Rubicon system management tool; and standard Hadoop add-ons such as Hive, Pig, and Mahout.
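To give a sense of the programming model those tools build on, here is a minimal word-count sketch of the map/shuffle/reduce pattern. This is plain single-process Python, not Hadoop’s actual Java API; the function names are illustrative only. Hadoop’s value is running the map and reduce phases in parallel across many nodes, while this sketch only shows the data flow:

```python
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs, the way a Hadoop mapper would.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    # Group values by key, mirroring Hadoop's shuffle/sort step.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts per word, the way a Hadoop reducer would.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data on big clusters", "big data tools"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"])  # prints 3
```

Tools like Hive and Pig generate jobs following this same pattern from higher-level queries, which is why they appear alongside MapReduce on the cluster.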

Greenplum isn’t the first to have assembled such a large test bed, but it is probably the first to use this level of gear for Hadoop and other data science activities. In the late 1980s, a group of Novell engineers in Utah created the “SuperLab,” which eventually grew to 1,700 PCs connected together. The lab was used to prove the features and scalability of Novell’s NetWare network operating system, a piece of software that at one time could be found in most enterprises but is now largely a historical curiosity. Just to give you some perspective, in 1999 the PCs in Novell’s lab had a whopping 256 MB of RAM and 8 GB of storage (try buying that on today’s PCs). How times have changed.

Anyway, the SuperLab team left Novell a few years later and built their own private test lab for a startup called Keylabs. I was one of their early customers, using the facility to run some of the first Web server comparison tests, whose results I published in c|net and other IT publications.

The Keylabs engineers very quickly discovered that automating the sequencing and actions of the individual PCs was tedious, and they wrote software that eventually spawned Altiris. Part of the assets of this company was later purchased by Symantec and is still used for their desktop imaging and management tool line.

Speaking of scaling up to a thousand machines automatically: running tests at this scale can be tricky. Greenplum has already seen several hardware failures take down particular nodes as they have begun using their cluster. And like Keylabs, they have found that sequencing all this gear to come online quickly can be vexing. Imagine that each machine takes just ten minutes to boot up and launch an app. Across ten or twenty nodes that isn’t a big deal, but when you are trying to bring up hundreds, it could tie up the cluster for the better part of a week just starting the tests. “It is a bit of a challenge in educating our customers on how to use and manage something of this size and how to deploy their software across the entire cluster. You can’t deploy software serially, and we have to make sure that our customers understand these issues,” says Davis.
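The startup arithmetic is easy to check. A hypothetical model (assuming a uniform ten-minute boot time and a fixed degree of parallelism, numbers chosen only for illustration) shows why serial deployment is a non-starter:

```python
# Hypothetical cluster-startup model: 1,000 nodes, 10 minutes each.
nodes = 1000
boot_minutes = 10
wave_size = 100  # how many nodes we can bring up at once

serial_days = nodes * boot_minutes / 60 / 24           # one at a time
waves = -(-nodes // wave_size)                         # ceiling division
parallel_minutes = waves * boot_minutes                # 100 at a time

print(f"Serial startup: {serial_days:.1f} days")       # ~6.9 days
print(f"100-way parallel: {parallel_minutes} minutes") # 100 minutes
```

Booting one node at a time would indeed consume the better part of a week, which matches Davis's point: the deployment has to be parallelized before any test can even begin.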

So get your application in now to test your app. You could be making computing history.

More Stories By David Strom

David Strom is an international authority on network and Internet technologies. He has written extensively on the topic for 20 years for a wide variety of print publications and websites, such as The New York Times, TechTarget.com, PC Week/eWeek, Internet.com, Network World, Infoworld, Computerworld, Small Business Computing, Communications Week, Windows Sources, c|net and news.com, Web Review, Tom's Hardware, EETimes, and many others.


