By Application Security
March 2, 2013 09:00 AM EST
It sounds like a parlor trick, but one of the benefits of API-centric de-facto standards such as REST and JSON is that they allow relatively seamless communication between software systems.
This makes it possible to combine technologies and quickly unlock new capabilities. In particular, I want to talk about how an API Gateway can improve the security posture of a Hadoop installation without actually modifying Hadoop itself. Sounds too good to be true? Read on.
Hadoop and RESTful APIs
Hadoop is mostly a behind-the-firewall affair, and APIs are generally used to expose data or capabilities to other systems, users or mobile devices. In the case of Hadoop there are three main RESTful APIs to talk about. This list isn’t exhaustive, but it covers the main APIs.
- WebHDFS – Offers complete control over files and directories in HDFS
- HBase REST API – Offers access to insert, create and delete single or multiple cell values
- HCatalog REST API – Provides job control for Map/Reduce, Pig and Hive as well as to access and manipulate HCatalog DDL data
These APIs are very useful because anyone with an HTTP client can potentially manipulate data in Hadoop. This, of course, is like using an all-blade knife – it’s very easy to cut yourself. To take an example, WebHDFS allows RESTful calls for directory listings, creating new directories and files, as well as file deletion. Worse, the default security model requires nothing more than passing user.name=root as a query parameter in the HTTP call.
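To make the risk concrete, here is a minimal sketch of how a WebHDFS caller asserts an identity. The hostname and port are illustrative placeholders, not a real cluster; the point is that in the default "simple" security mode, user.name is the entire credential:

```python
from urllib.parse import urlencode

# Illustrative namenode address -- substitute your own cluster here.
NAMENODE = "http://namenode.example.com:50070"

def webhdfs_url(path, op, user="root", **params):
    """Build a WebHDFS URL; in default simple-security mode, the
    'user.name' query parameter is all that identifies the caller."""
    query = {"op": op, "user.name": user, **params}
    return f"{NAMENODE}/webhdfs/v1{path}?{urlencode(query)}"

# List a directory as "root" -- no password, token or certificate needed
# (sent as an HTTP GET):
print(webhdfs_url("/data", "LISTSTATUS"))
# Delete a file as "root" (sent as an HTTP DELETE):
print(webhdfs_url("/data/secret.csv", "DELETE", recursive="true"))
```

Any HTTP client can send these requests; nothing stops a caller from picking "root" as the user.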
To its credit, most distributions of Hadoop also offer Kerberos SPNEGO authentication, but additional work is needed to support other types of authentication and authorization schemes, and not all REST calls that expose sensitive data (such as a list of files) are secured. Here are some of the other challenges:
- Fragmented Enforcement – Some REST calls leak information and require no credentials
- Developer Centric Interfaces – Full Java stack traces are passed back to callers, leaking system details
- Resource Protection – The Namenode is a single point of failure and excessive WebHDFS activity may threaten the cluster
- Inconsistent Security Policy – Each API in Hadoop must be independently configured, managed and audited over time
This list is just a start, and to be fair, Hadoop is still evolving. We expect things to get better over time, but for Enterprises to unlock value from their “Big Data” projects now, they can’t afford to wait until security is perfect.
One model used in other domains is an API Gateway or proxy that sits between the Hadoop cluster and the client. Using this model, the cluster trusts only calls from the gateway, and all potential API callers are forced to go through it. Further, the gateway’s capabilities are rich and expressive enough to perform the full depth and breadth of security for REST calls, from authentication to message-level security, tokenization, throttling, denial-of-service protection, attack protection and data translation. Even better, this provides a safe and effective way to expose Hadoop to mobile devices without worrying about performance, scalability and security. Here is the conceptual picture:
Intel Expressway API Manager and Intel Distribution of Apache Hadoop
The previous diagram shows the Intel(R) Expressway API Manager acting as a proxy for the WebHDFS, HBase and HCatalog APIs exposed from Intel’s Hadoop distribution. API Manager exposes RESTful APIs and also provides an out-of-the-box subscription to Mashery to help evangelize APIs among a community of developers.
All of the policy enforcement is done at the HTTP layer by the gateway. The security administrator is free to rewrite the API to be friendlier to the caller, and the gateway takes care of mapping and rewriting the REST call to the format supported by Hadoop. In short, this model lets you provide instant Enterprise security for a good chunk of Hadoop capabilities without having to add a plug-in, additional code or a special distribution of Hadoop. So… just what can you do without touching Hadoop? Taking WebHDFS as an example, the following is possible with some configuration on the gateway itself:
- A gateway can lock-down the standard WebHDFS REST API and allow access only for specific users based on an Enterprise identity that may be stored in LDAP, Active Directory, Oracle, Siteminder, IBM or Relational Databases.
- A gateway provides additional authentication methods such as X.509 certificates with CRL and OCSP checking, OAuth token handling, API key support, WS-Security and SSL termination & acceleration for WebHDFS API calls. The gateway can expose secure versions of the WebHDFS API for external access.
- A gateway can improve on the security model used by WebHDFS, which carries identities in HTTP query parameters; these are more susceptible to credential leakage than a security model based on HTTP headers. The gateway can expose a variant of the WebHDFS API that expects credentials in an HTTP header and seamlessly maps them to the WebHDFS internal format.
- The gateway workflow engine can map a single-function REST call into multiple WebHDFS calls. For example, the WebHDFS REST API requires two separate HTTP calls for file creation and file upload. The gateway can expose a single API that handles the sequential execution and error handling, presenting a single function to the user.
- The gateway can strip and redact Java exception traces carried in WebHDFS REST API responses (for instance, JSON responses may carry org.apache.hadoop.security.AccessControlException details, which can spill information beneficial to an attacker).
- The gateway can throttle and rate-shape WebHDFS REST requests, protecting the Hadoop cluster from resource consumption caused by excessive HDFS writes, open file handles, or bursts of create, read, update and delete operations that might impact a running job.
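As one illustration of the list above, the exception-redaction step can be pictured as a simple response filter. This is only a sketch of the idea, not the product's actual policy language; it scrubs Java exception class names from a WebHDFS-style JSON error body before it reaches the caller:

```python
import json
import re

# Matches fully-qualified Java exception class names such as
# org.apache.hadoop.security.AccessControlException
JAVA_EXCEPTION = re.compile(r"\b(?:[a-z_]\w*\.)+[A-Z]\w*(?:Exception|Error)\b")

def redact_error(body: str) -> str:
    """Replace internal exception details in a WebHDFS-style JSON error
    with generic text, so stack-trace details never leave the gateway."""
    try:
        doc = json.loads(body)
    except ValueError:
        return '{"RemoteException":{"message":"Internal error"}}'
    exc = doc.get("RemoteException")
    if exc:
        exc["javaClassName"] = "redacted"
        exc["message"] = JAVA_EXCEPTION.sub("redacted", exc.get("message", ""))
        exc.pop("exception", None)  # drop the short class name as well
    return json.dumps(doc)

raw = ('{"RemoteException":{"exception":"AccessControlException",'
       '"javaClassName":"org.apache.hadoop.security.AccessControlException",'
       '"message":"Permission denied: user=guest, access=WRITE"}}')
print(redact_error(raw))
```

The caller still learns that the request failed, but not which internal classes or code paths produced the failure.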
This list is just the start. API Manager can also perform selective encryption and data protection (such as PCI tokenization or PII format-preserving encryption) on data as it is inserted into or deleted from the Hadoop cluster, all by sitting in between the caller and the cluster. So the parlor trick here is really moving the problem: instead of trying to secure Hadoop from the inside out, security is centralized at the enforcement point. If you are looking for a way to expose “Big Data” outside the cluster, the API Gateway model may be worth some investigation!
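The tokenization idea can be pictured with a toy in-memory vault. This is purely illustrative: a real PCI tokenization service uses a hardened, persistent vault, and format-preserving encryption uses a standardized cipher rather than random tokens.

```python
import secrets

class TokenVault:
    """Toy in-memory vault: swaps a sensitive value (e.g. a card number)
    for a random token on the way into Hadoop, and back on the way out."""
    def __init__(self):
        self._by_token = {}
        self._by_value = {}

    def tokenize(self, value: str) -> str:
        if value in self._by_value:          # same value -> same token
            return self._by_value[value]
        token = "tok_" + secrets.token_hex(8)
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._by_token[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # what Hadoop actually stores
print(token)
```

Because the gateway sits in the data path, the cluster only ever sees the token; the real value is recoverable only by callers the gateway authorizes to detokenize.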
The post How to secure Hadoop without touching it – combining API Security and Hadoop appeared first on Security Gateways@Intel.