Apache Spark Native Libraries
Question
I was recently able to build Apache Hadoop 2.5.1 with native 64-bit support, so I got rid of the annoying native-libraries warning.
I'm now trying to configure Apache Spark. When I start spark-shell, the same warning appears:
14/09/14 18:48:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Some hints:
I had to download the pre-built Hadoop 2.4 version of Spark, because there is still no Maven build profile for Hadoop 2.5.
The following exports were added to spark-env.sh:
export HADOOP_CONF_DIR=/opt/hadoop-2.5.1/etc/hadoop
export SPARK_LIBRARY_PATH=/opt/hadoop-2.5.1/lib/native
Neither spark-shell nor spark-submit picked up the native libraries. My local Hadoop installation is configured as pseudo-distributed (ResourceManager + YARN support).
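As an aside, the pre-built download can sometimes be avoided by building Spark from source against a newer Hadoop: on Spark releases of that era, the `hadoop-2.4` Maven profile accepted an overridden `hadoop.version` property. This is a sketch run from a Spark source checkout; exact profile names vary by Spark release:

```shell
# Sketch: build Spark against Hadoop 2.5.1 by reusing the hadoop-2.4
# profile with an explicit version override (run inside a Spark source
# checkout; profile names vary by Spark release).
mvn -Phadoop-2.4 -Pyarn -Dhadoop.version=2.5.1 -DskipTests clean package
```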
Answer
You should add HADOOP_HOME/lib/native to the LD_LIBRARY_PATH:
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
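As a minimal sketch, the export can go in spark-env.sh (or a login shell profile). The /opt/hadoop-2.5.1 path matches the install location used in the question; adjust HADOOP_HOME to your own layout:

```shell
# Sketch for spark-env.sh — /opt/hadoop-2.5.1 matches the install path
# used in the question; adjust HADOOP_HOME to your own layout.
export HADOOP_HOME=/opt/hadoop-2.5.1
# Prepend the native libs, preserving any pre-existing LD_LIBRARY_PATH.
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
echo "$LD_LIBRARY_PATH"
```

On Hadoop 2.x, running `hadoop checknative -a` afterwards reports whether the native hadoop library actually loads.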