Still getting "Unable to load realm info from SCDynamicStore" after bug fix

Question
I installed Hadoop and Pig using brew install hadoop and brew install pig.

I read here that you will get the Unable to load realm info from SCDynamicStore error message unless you add:

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

to your hadoop-env.sh file, which I have.

However, when I run hadoop namenode -format, I still see:

java[1548:1703] Unable to load realm info from SCDynamicStore

amongst the output.

Does anyone know why I'm still getting it?
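For reference, the relevant addition to hadoop-env.sh would look something like the sketch below. The OX.AC.UK realm and ox.ac.uk KDC hosts are just the example values from the quoted workaround, not values your cluster needs; the comment about the file's location is an assumption that depends on your Homebrew and Hadoop versions.

```shell
# hadoop-env.sh -- with a Homebrew install this typically lives somewhere
# under the Hadoop cellar directory (exact path varies by Hadoop version).

# Work around the SCDynamicStore Kerberos lookup on OS X by giving the JVM
# an explicit realm and KDC. These are the example values from the original
# workaround; they only need to be syntactically valid, not real.
export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
```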
Answer

As dturnanski suggests, you need to use an older JDK. You can set this in the hadoop-env.sh file by changing the JAVA_HOME setting to:

export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

(Note the backticks here.) This fixed the problem for me.
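If you want to confirm which JDKs are available before making the change, a quick check from a terminal might look like this (OS X only; the versions listed will differ per machine, so no particular output is shown):

```shell
# List every installed JDK that java_home knows about.
/usr/libexec/java_home -V

# Resolve the home directory of a 1.6 JDK, as in the fix above.
# Backticks (or the equivalent $(...)) capture the command's output.
export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

# Confirm the shell now points at the intended JDK.
echo "$JAVA_HOME"
"$JAVA_HOME/bin/java" -version
```

If `java_home -v 1.6` reports no matching JVM, you would need to install an Apple/legacy Java 6 runtime first, since the workaround depends on that older JDK being present.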