Apache Spark Native Libraries
Problem description
I was recently able to build Apache Hadoop 2.5.1 with native 64-bit support, so I got rid of the annoying native-libraries warning.
I'm trying to configure Apache Spark. When I start spark-shell, the same warning appears:
14/09/14 18:48:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Some tips:
I had to download a pre-built Spark for Hadoop 2.4, because the Maven build has no profile for Hadoop 2.5 yet.
The following exports were added to spark-env.sh:
export HADOOP_CONF_DIR=/opt/hadoop-2.5.1/etc/hadoop
export SPARK_LIBRARY_PATH=/opt/hadoop-2.5.1/lib/native
These exports didn't help with either spark-shell or spark-submit. My local Hadoop installation is configured as pseudo-distributed (ResourceManager + YARN support).
You should add HADOOP_HOME/lib/native to LD_LIBRARY_PATH:
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native