How to Secure Hadoop Without Touching It

Combining API Security and Hadoop

It sounds like a parlor trick, but one of the benefits of API-centric de facto standards such as REST and JSON is that they allow relatively seamless communication between software systems.

This makes it possible to combine technologies to instantly bring out new capabilities. In particular, I want to talk about how an API Gateway can improve the security posture of a Hadoop installation without having to actually modify Hadoop itself. Sounds too good to be true? Read on.

Hadoop and RESTful APIs
Hadoop is mostly a behind-the-firewall affair, and APIs are generally used for exposing data or capabilities to other systems, users or mobile devices. In the case of Hadoop, there are three main RESTful APIs to talk about. This list isn’t exhaustive, but it covers the main ones.

  1. WebHDFS – Offers complete control over files and directories in HDFS
  2. HBase REST API – Offers insert, create and delete access to single or multiple cell values
  3. HCatalog REST API – Provides job control for Map/Reduce, Pig and Hive, as well as access to and manipulation of HCatalog DDL data

These APIs are very useful because anyone with an HTTP client can potentially manipulate data in Hadoop. This, of course, is like using a knife that is all blade – it’s very easy to cut yourself. To take an example, WebHDFS allows RESTful calls for directory listings, creating new directories and files, as well as file deletion. Worse, the default security model requires nothing more than inserting “root” into the HTTP call.
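
To see how low that bar is, here is a minimal sketch in Python (using the requests library; the NameNode host is hypothetical) of calls the default, non-Kerberos security model will happily accept, since the user.name parameter is taken at face value:

```python
import requests

# Hypothetical NameNode address; 50070 was the long-time default WebHDFS port.
NAMENODE = "http://namenode.example.com:50070"

# List a directory while merely *claiming* to be root.
resp = requests.get(
    f"{NAMENODE}/webhdfs/v1/sensitive/data",
    params={"op": "LISTSTATUS", "user.name": "root"},
)
print(resp.json())

# The same trick works for destructive calls, such as a recursive delete.
requests.delete(
    f"{NAMENODE}/webhdfs/v1/sensitive/data",
    params={"op": "DELETE", "recursive": "true", "user.name": "root"},
)
```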

To Hadoop’s credit, most distributions also offer Kerberos SPNEGO authentication, but additional work is needed to support other types of authentication and authorization schemes, and not all REST calls that expose sensitive data (such as a list of files) are secured. Here are some of the other challenges:

  • Fragmented Enforcement – Some REST calls leak information and require no credentials
  • Developer-Centric Interfaces – Full Java stack traces are passed back to callers, leaking system details
  • Resource Protection – The NameNode is a single point of failure, and excessive WebHDFS activity may threaten the cluster
  • No Consistent Security Policy – APIs in Hadoop must each be independently configured, managed and audited over time

This list is just a start, and to be fair, Hadoop is still evolving. We expect things to get better over time, but Enterprises that want to unlock value from their “Big Data” projects now can’t afford to wait until security is perfect.

One model used in other domains is an API Gateway or proxy that sits between the Hadoop cluster and the client. Using this model, the cluster only trusts calls from the gateway, and all potential API callers are forced to go through it. Further, the gateway’s capabilities are rich and expressive enough to cover the full depth and breadth of security for REST calls: authentication, message-level security, tokenization, throttling, denial-of-service protection, attack protection and data translation. Even better, this provides a safe and effective way to expose Hadoop to mobile devices without worrying about performance, scalability and security. Here is the conceptual picture:

Intel Expressway API Manager and Intel Distribution of Apache Hadoop

The previous diagram shows Intel® Expressway API Manager acting as a proxy for the WebHDFS, HBase and HCatalog APIs exposed from Intel’s Hadoop distribution. API Manager exposes RESTful APIs and also provides an out-of-the-box subscription to Mashery to help evangelize APIs among a community of developers.
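
To make the proxy model concrete, here is a minimal sketch of the enforcement pattern (Python, using Flask and requests; the host names, port and API keys are hypothetical stand-ins for an enterprise identity store, and this illustrates the concept rather than how API Manager itself is configured): authenticate the caller before Hadoop ever sees the request, map the enterprise identity to WebHDFS’s native format, and forward the call.

```python
from flask import Flask, Response, abort, request
import requests

app = Flask(__name__)

NAMENODE = "http://namenode.internal:50070"    # hypothetical, firewalled host
API_KEYS = {"k-123": "alice", "k-456": "bob"}  # stand-in for an LDAP/AD lookup

@app.route("/hdfs/<path:path>", methods=["GET", "PUT", "POST", "DELETE"])
def proxy(path):
    # Enforce authentication before the cluster ever sees the call.
    user = API_KEYS.get(request.headers.get("X-Api-Key", ""))
    if user is None:
        abort(401)
    # Map the enterprise identity to WebHDFS's query-parameter format.
    params = dict(request.args)
    params["user.name"] = user
    upstream = requests.request(
        request.method,
        f"{NAMENODE}/webhdfs/v1/{path}",
        params=params,
        data=request.get_data(),
        allow_redirects=False,  # a real gateway would also rewrite redirects
    )
    return Response(upstream.content, status=upstream.status_code)

if __name__ == "__main__":
    app.run(port=8443)
```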

All of the policy enforcement is done at the HTTP layer by the gateway. The security administrator is free to rewrite the API to be more friendly to the caller, and the gateway takes care of mapping and rewriting the REST call into the format supported by Hadoop. In short, this model lets you provide instant Enterprise security for a good chunk of Hadoop capabilities without having to add a plug-in, additional code or a special distribution of Hadoop. So… just what can you do without touching Hadoop? Taking WebHDFS as an example, the following is possible with some configuration on the gateway itself:

  1. A gateway can lock down the standard WebHDFS REST API and allow access only for specific users, based on an Enterprise identity that may be stored in LDAP, Active Directory, Oracle, SiteMinder, IBM or relational databases.
  2. A gateway provides additional authentication methods such as X.509 certificates with CRL and OCSP checking, OAuth token handling, API key support, WS-Security, and SSL termination and acceleration for WebHDFS API calls. The gateway can expose secure versions of the WebHDFS API for external access.
  3. A gateway can improve on the security model used by WebHDFS, which carries identities in HTTP query parameters; these are more susceptible to credential leakage than a security model based on HTTP headers. The gateway can expose a variant of the WebHDFS API that expects credentials in an HTTP header and seamlessly maps them to the WebHDFS internal format.
  4. The gateway workflow engine can map a single-function REST call into multiple WebHDFS calls. For example, the WebHDFS REST API requires two separate HTTP calls for file creation and file upload. The gateway can expose a single API that handles the sequential execution and error handling, presenting one function to the user (see the first sketch after this list).
  5. The gateway can strip and redact Java exception traces carried in WebHDFS REST API responses. For instance, JSON responses may carry org.apache.hadoop.security.AccessControlException.*, which can spill details beneficial to an attacker (see the second sketch after this list).
  6. The gateway can throttle and rate-shape WebHDFS REST requests, protecting the Hadoop cluster from resource consumption caused by excessive HDFS writes, open file handles, and excessive create, read, update and delete operations that might impact a running job (see the third sketch after this list).
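
To make point 4 concrete, here is a minimal sketch, in Python with the requests library, of the composition a gateway workflow engine could perform: WebHDFS file creation is really two HTTP calls (the NameNode answers with a redirect, and the content goes to the DataNode it names), and the gateway can hide that behind one function. The host and user names are hypothetical.

```python
import requests

def create_file(namenode, path, data, user):
    """One caller-facing call that hides WebHDFS's two-step create."""
    # Step 1: ask the NameNode; it answers with a 307 redirect, not the file.
    r1 = requests.put(
        f"{namenode}/webhdfs/v1{path}",
        params={"op": "CREATE", "user.name": user},
        allow_redirects=False,
    )
    if r1.status_code != 307:
        raise RuntimeError(f"unexpected NameNode reply: {r1.status_code}")
    # Step 2: upload the bytes to the DataNode named in the Location header.
    r2 = requests.put(r1.headers["Location"], data=data)
    if r2.status_code != 201:
        raise RuntimeError(f"upload failed: {r2.status_code}")

# Hypothetical usage: one call instead of two, with error handling built in.
create_file("http://namenode.example.com:50070", "/user/alice/report.csv",
            b"col1,col2\n1,2\n", user="alice")
```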
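
For point 5, here is a sketch of a response filter the gateway could apply before a reply leaves the cluster. WebHDFS reports errors as a JSON RemoteException object whose class names and messages can reveal internals; the filter collapses these to a generic error and scrubs anything that still looks like a Hadoop class name. Treat the exact response shapes as illustrative.

```python
import json
import re

# Anything that looks like an internal Hadoop class name gets scrubbed.
TRACE = re.compile(r"org\.apache\.hadoop[\w.$]*")

def redact(body: str) -> str:
    try:
        doc = json.loads(body)
    except ValueError:
        return body
    if isinstance(doc, dict) and "RemoteException" in doc:
        # Keep the fact that the call failed; log the details server-side only.
        return json.dumps({"error": "access denied or invalid request"})
    return TRACE.sub("[redacted]", body)
```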
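
And for point 6, a minimal token-bucket rate shaper of the kind a gateway applies per caller; the rate and burst values are placeholders to tune against your cluster.

```python
import time

class TokenBucket:
    """Allow at most `rate` calls per second, with bursts up to `burst`,
    so one runaway client cannot starve the NameNode."""

    def __init__(self, rate: float, burst: float):
        self.rate, self.burst = rate, burst
        self.tokens, self.last = burst, time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last call.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # the gateway would answer HTTP 429 here

# Hypothetical per-caller limit: 20 requests/second, bursts of 50.
bucket = TokenBucket(rate=20, burst=50)
```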

This list is just the start; the API Manager can also perform selective encryption and data protection (such as PCI tokenization or PII format-preserving encryption) on data as it is inserted into or deleted from the Hadoop cluster, all by sitting in between the caller and the cluster. So the parlor trick here is really moving the problem: instead of trying to secure Hadoop from the inside out, security is moved to and centralized at the enforcement point. If you are looking for a way to expose “Big Data” outside the cluster, the API Gateway model may be worth some investigation!

Blake

 
