Hadoop "Unable to load native-hadoop library for your platform" error on docker-spark?

Problem Description

I am using docker-spark. After starting spark-shell, it outputs:

15/05/21 04:28:22 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
15/05/21 04:28:22 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
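
For reference, Hadoop 2.x ships a built-in diagnostic that reports which native libraries the runtime can actually load; assuming the bundled hadoop CLI in this image provides it, it prints a per-library true/false summary:

bash-4.1# hadoop checknative -a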

The environment variables of this spark container are:

bash-4.1# export
declare -x BOOTSTRAP="/etc/bootstrap.sh"
declare -x HADOOP_COMMON_HOME="/usr/local/hadoop"
declare -x HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
declare -x HADOOP_HDFS_HOME="/usr/local/hadoop"
declare -x HADOOP_MAPRED_HOME="/usr/local/hadoop"
declare -x HADOOP_PREFIX="/usr/local/hadoop"
declare -x HADOOP_YARN_HOME="/usr/local/hadoop"
declare -x HOME="/"
declare -x HOSTNAME="sandbox"
declare -x JAVA_HOME="/usr/java/default"
declare -x OLDPWD
declare -x PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/java/default/bin:/usr/local/spark/bin:/usr/local/hadoop/bin"
declare -x PWD="/"
declare -x SHLVL="3"
declare -x SPARK_HOME="/usr/local/spark"
declare -x SPARK_JAR="hdfs:///spark/spark-assembly-1.3.0-hadoop2.4.0.jar"
declare -x TERM="xterm"
declare -x YARN_CONF_DIR="/usr/local/hadoop/etc/hadoop"

After referring to Hadoop "Unable to load native-hadoop library for your platform" error on CentOS, I have done the following:

(1) Check the hadoop library:

bash-4.1# file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped

Yes, it is a 64-bit library.
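
A 64-bit library only helps if the JVM is 64-bit as well; a quick way to confirm (assuming the java on the PATH is the container's bundled JVM, whose version banner includes "64-Bit Server VM" for a 64-bit build):

bash-4.1# java -version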

(2) Try adding the HADOOP_OPTS environment variable:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"

It doesn't work, and reports the same error.
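
This is expected: HADOOP_OPTS is consumed only by the hadoop launcher scripts, and spark-shell never reads it. A sketch of the equivalent way to hand the same JVM flag to the Spark driver, using spark-shell's standard --driver-java-options flag:

spark-shell --driver-java-options "-Djava.library.path=/usr/local/hadoop/lib/native"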

(3) Try adding the HADOOP_OPTS and HADOOP_COMMON_LIB_NATIVE_DIR environment variables:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

It still doesn't work, and reports the same error.
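
Note that the export listing above defines HADOOP_PREFIX but no HADOOP_HOME, so both commands expand against an empty variable and end up pointing at /lib/native and /lib. A variant rewritten against the variable that is actually set in this container (a sketch, untested here) would be:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_PREFIX/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib/native"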

Could anyone give some clues about the issue?

Solution

Adding the Hadoop native library directory to LD_LIBRARY_PATH fixes this problem: on Linux the JVM seeds java.library.path from LD_LIBRARY_PATH, and unlike the HADOOP_* variables it is honored by every process, including spark-shell.

export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native/:$LD_LIBRARY_PATH
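
This export only lasts for the current shell. To make it persistent, one option is to append the same line to Spark's spark-env.sh, which the launch scripts source on every start (a sketch, assuming the standard conf directory under the SPARK_HOME shown above):

echo 'export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native/:$LD_LIBRARY_PATH' >> /usr/local/spark/conf/spark-env.sh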
