Hadoop Kerberos security


Problem description


I have set up a single-node cluster, and the KDC server as well as the clients are on the same machine. I tried all the possible options, but the same error persists. Based on my research, I made the following changes as suggested by other answers:

1) Installed the JCE jars in the $JAVA_HOME/jre/lib/security folder.
2) Edited the krb5.conf file to use only aes256-cts encryption.
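
As a quick sanity check for step 1 (this snippet is my own illustration, not part of the original post; the class name JceCheck is hypothetical), the JDK can report whether the unlimited-strength JCE policy is actually active:

import javax.crypto.Cipher;

public class JceCheck {
    public static void main(String[] args) throws Exception {
        // With the unlimited-strength policy files installed this prints
        // 2147483647 (Integer.MAX_VALUE); the default policy caps AES at
        // 128 bits, which would make aes256-cts tickets unusable from Java.
        System.out.println("Max AES key length: "
                + Cipher.getMaxAllowedKeyLength("AES"));
    }
}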

/etc/krb5.conf looks like this:

[logging]
 default = FILE:/var/log/krb5libs.log  
 kdc = FILE:/var/log/krb5kdc.log  
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 dns_lookup_realm = false  
 ticket_lifetime = 24h  
 renew_lifetime = 7d  
 forwardable = true  
 rdns = false  
 default_realm = EXAMPLE.COM  
 default_ccache_name = KEYRING:persistent:%{uid}  
 default_tkt_enctypes = aes256-cts  
 default_tgs_enctypes = aes256-cts  
 permitted_enctypes   = aes256-cts  
[realms]  
 EXAMPLE.COM = {
  kdc = localhost  
  admin_server = localhost  
 }  

[domain_realm]  
 localhost = EXAMPLE.COM  

/var/kerberos/krb5kdc/kdc.conf looks like this:

[kdcdefaults]  
 kdc_ports = 88  
 kdc_tcp_ports = 88  

[realms]  
 EXAMPLE.COM = {  
  #master_key_type = aes256-cts  
  acl_file = /var/kerberos/krb5kdc/kadm5.acl  
  dict_file = /usr/share/dict/words  
  admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab  
  supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal camellia256-cts:normal camellia128-cts:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal  
  max_life = 24h 0m 0s  
  max_renewable_life = 7d 0h 0m 0s  
}  

The namenode and datanode start up with the credentials that have been provided in the keytab file. After the namenode and datanode started, I created a principal which is already a unix user in the hadoop group, namely 'hdfs', with the addprinc command. Then I used the kinit command (kinit hadoop), which was successful. The klist -e output shows that the enc type is aes-256, as expected. But when I try a hadoop fs -ls / command, I get the error below.

Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
KinitOptions cache name is /tmp/krb5cc_1001
15/06/26 13:20:18 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "/"; destination host is: "":9000;
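
For reference, the "Failed to find any Kerberos tgt" message comes from the JDK's GSS layer, and the failing step can be reproduced outside Hadoop with a small JGSS sketch (my own illustration; the class name TgtCheck is hypothetical). Run with -Djavax.security.auth.useSubjectCredsOnly=false so the JDK reads the native ticket cache directly; it throws the same exception when no readable credential cache is found:

import org.ietf.jgss.GSSCredential;
import org.ietf.jgss.GSSException;
import org.ietf.jgss.GSSManager;
import org.ietf.jgss.Oid;

public class TgtCheck {
    public static void main(String[] args) throws GSSException {
        // OID of the Kerberos v5 mechanism
        Oid krb5Mech = new Oid("1.2.840.113554.1.2.2");
        GSSManager manager = GSSManager.getInstance();
        // Tries to acquire initiator credentials (the TGT) from the
        // ticket cache; fails with "Failed to find any Kerberos tgt"
        // if the cache is missing or unreadable by the JDK.
        GSSCredential cred = manager.createCredential(
                null, GSSCredential.DEFAULT_LIFETIME, krb5Mech,
                GSSCredential.INITIATE_ONLY);
        System.out.println("Found TGT for: " + cred.getName());
    }
}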

Help needed please.

Solution

The reason for the error is already in the message: your configuration lists default_ccache_name = KEYRING:persistent:%{uid}, which stores credentials in a secure kernel buffer on Linux. Java is not able to read this buffer, and thus you get the error.

You will need to set this to something like:

default_ccache_name = /tmp/krb5cc_%{uid}

or override it with the KRB5CCNAME environment variable.
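
To verify the fix from the Java side, here is a minimal sketch (assuming the Hadoop client libraries are on the classpath; the class name KerberosLoginCheck is my own illustration, not from the original answer). The Hadoop login should now succeed from the same ticket cache that kinit wrote:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Reads the ticket cache (now /tmp/krb5cc_<uid>); with the
        // KEYRING: cache type, this is the step where login would fail.
        UserGroupInformation ugi = UserGroupInformation.getLoginUser();
        System.out.println("Logged in as: " + ugi.getUserName());
    }
}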
