Security is an enterprise feature
At Hadoop Summit, Merv Adrian (VP at Gartner) presented data about Hadoop's adoption in the enterprise space over the last two years, and the numbers weren't great (actually they weren't even good).
Hadoop vendors are becoming more aggressive in adding features that would make Hadoop enterprise-ready. In some sectors (e.g. government, financial, and health services) data security is regulated, which makes security features a top priority for adopting Hadoop in these spaces.
The state of Hadoop Security
There's a mix of activity on both the open-source and vendor-proprietary sides to address this gap. Several projects are at incubation stage within Apache, or awaiting Apache approval: an LDAP/Active Directory-linked gateway (Knox), data lifecycle policies (Falcon), and APIs for processor-based encryption (Rhino). There's also an NSA-originated project adding fine-grained data security (Accumulo), based on Google BigTable constructs. And Hive Server 2 will add the LDAP/AD integration that's currently missing.
What’s interesting to note is that many big vendors have been focusing on adding proprietary security and auditing features to Hadoop.
Cloudera's post introducing Sentry also provides a short overview of security in Hadoop, covering four areas:
- Perimeter security: network security, firewall, and Kerberos authentication
- Data security: encryption and masking currently available through a combination of recent work in the Hadoop community and vendor solutions.
- Access security: fine-grained ACLs
- Visibility: monitoring access and auditing
Sentry: Role-based Access Control for Hadoop
Cloudera has announced Sentry, a fine-grained role-based access control solution for Hadoop meant to simplify and augment the coarse-grained HDFS-level authorization currently available in Hadoop.
Sentry comprises a core authorization provider and a binding layer. The core authorization provider contains a policy engine, which evaluates and validates security policies, and a policy provider, which is responsible for parsing the policy. The binding layer provides a pluggable interface that can be leveraged by a binding implementation to talk to the policy engine. (Note that the policy provider and the binding layer both provide pluggable interfaces.)
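The pluggable architecture described above can be sketched roughly as follows. Note that the interface names, method signatures, and policy strings here are illustrative assumptions, not the actual Sentry API:

```java
import java.util.*;

// Hypothetical sketch of Sentry's two pluggable layers; names and
// signatures are illustrative, not the real Sentry interfaces.
interface PolicyProvider {
    // Parses the configured policy source into group -> roles mappings.
    Map<String, Set<String>> rolesForGroups();
    // ...and role -> privileges mappings.
    Map<String, Set<String>> privilegesForRoles();
}

interface PolicyEngine {
    // Decides whether any of the user's groups grants the requested privilege.
    boolean hasAccess(Set<String> groups, String privilege);
}

// A trivial in-memory engine wired to a provider. A real binding
// (e.g. for Hive or Impala) would call something like this from the
// host system's authorization hooks.
class SimplePolicyEngine implements PolicyEngine {
    private final PolicyProvider provider;

    SimplePolicyEngine(PolicyProvider provider) {
        this.provider = provider;
    }

    @Override
    public boolean hasAccess(Set<String> groups, String privilege) {
        Map<String, Set<String>> groupRoles = provider.rolesForGroups();
        Map<String, Set<String>> rolePrivs = provider.privilegesForRoles();
        for (String group : groups) {
            for (String role : groupRoles.getOrDefault(group, Set.of())) {
                if (rolePrivs.getOrDefault(role, Set.of()).contains(privilege)) {
                    return true;
                }
            }
        }
        return false;
    }
}
```

The point of the split is that the engine never needs to know where policies come from (file, database, LDAP) or which system is asking (Hive, Impala); those concerns live behind the provider and binding interfaces respectively.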
At this time, we have implemented a file-based provider that can understand a specific policy file format.
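A file-based policy of the kind described might look something like the following sketch; the group names, role names, and server/database values are made up for illustration:

```ini
# Illustrative Sentry-style policy file (all names are hypothetical)
[groups]
# OS/LDAP group -> comma-separated Sentry roles
analysts = analyst_role

[roles]
# role -> comma-separated privileges
analyst_role = server=server1->db=analytics->table=clicks->action=select
```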
According to the post, right now only Impala and Hive have bindings for Sentry. This makes me wonder how Sentry is deployed in a Hadoop cluster so that other layers could take advantage of the Sentry ACLs. I would expect such a security feature to be implemented very close to HDFS, so it would apply to all types of access to the stored data.
For more details about Sentry, read the official post With Sentry, Cloudera Fills Hadoop’s Enterprise Security Gap.
There are also numerous rewrites of the announcement:
- Rachel King for ZDNet: Cloudera intros new authorization module for Hadoop
- Virginia Backaitis for CMSWire: Cloudera Delivers Sentry Security For Hadoop: Regulated Enterprises Can Now Ask Big Data Questions
- Justin Lee for TheWhir: Cloudera Introduces New Authorization Module for Hadoop
- Isaac Lopez for Datanami: Cloudera Adds a Sentry to Their Stack
- Jordan Novet for GigaOm: Cloudera keeps sensitive data hidden from prying eyes with new authorization settings
- Doug Henschen for InformationWeek: Cloudera Brings Role-Based Security To Hadoop
- Nick Kolakowski for Slashdot: Cloudera’s Sentry Offers Access Security for Big Data
Original title and link: Hadoop Security and Cloudera’s new Role Based Access Control Sentry project ( ©myNoSQL)