Still getting "Unable to load realm info from SCDynamicStore" after bug fix


Problem description



I installed Hadoop and Pig using brew install hadoop and brew install pig.

I read here that you will get the Unable to load realm info from SCDynamicStore error message unless you add:

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

to your hadoop-env.sh file, which I have.
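For reference, this is roughly what the relevant part of hadoop-env.sh might look like after the edit (a sketch: the realm OX.AC.UK and the kdc0/kdc1.ox.ac.uk hosts are just the placeholder values from the linked post, since the warning only concerns Kerberos lookup; the file path under a Homebrew install is an assumption and may vary by version):

# hadoop-env.sh -- e.g. /usr/local/Cellar/hadoop/<version>/libexec/conf/hadoop-env.sh on Homebrew
# Append to any existing HADOOP_OPTS rather than overwriting it.
export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"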

However, when I run hadoop namenode -format, I still see:

java[1548:1703] Unable to load realm info from SCDynamicStore

amongst the output.

Anyone know why I'm still getting it?

Solution

As dturnanski suggests, you need to use an older JDK. You can set this in the hadoop-env.sh file by changing the JAVA_HOME setting to:

export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

(Note the backticks here: the command between them is executed and its output substituted, so JAVA_HOME ends up pointing at the 1.6 JDK's home directory.) This fixed the problem for me.
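As a quick sanity check before restarting Hadoop (not part of the answer itself, and assuming an Apple Java 1.6 runtime is actually installed), you can run the java_home command on its own to see which JDK the backticks will substitute:

/usr/libexec/java_home -v 1.6
# prints a path like /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home;
# if no matching JDK is installed it prints an error instead, and the export
# above would leave JAVA_HOME empty
`/usr/libexec/java_home -v 1.6`/bin/java -version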
