Apache Spark Native Libraries



I was recently able to build Apache Hadoop 2.5.1 with native 64-bit support, so I got rid of the annoying native libraries warning.

I'm trying to configure Apache Spark. When I start spark-shell, the same warning appears:

14/09/14 18:48:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Some tips:

I had to download a pre-built version of Spark for Hadoop 2.4, because there is still no Maven build profile for Hadoop 2.5.

The following exports were added to spark-env.sh:

export HADOOP_CONF_DIR=/opt/hadoop-2.5.1/etc/hadoop

export SPARK_LIBRARY_PATH=/opt/hadoop-2.5.1/lib/native

This didn't work with either spark-shell or spark-submit. My local Hadoop installation is configured as pseudo-distributed (ResourceManager + YARN support).

Solution

You should add $HADOOP_HOME/lib/native to LD_LIBRARY_PATH:

export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
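A minimal sketch of how this could look in spark-env.sh, assuming Hadoop is installed at /opt/hadoop-2.5.1 as in the question (the `${LD_LIBRARY_PATH:+...}` form is one way to preserve any value the variable already had):

```shell
# Sketch of the relevant spark-env.sh lines; /opt/hadoop-2.5.1 is the
# install path from the question -- adjust HADOOP_HOME for your setup.
export HADOOP_HOME=/opt/hadoop-2.5.1

# Prepend the native library directory, keeping any existing LD_LIBRARY_PATH.
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

echo "$LD_LIBRARY_PATH"
```

After editing spark-env.sh, restart spark-shell; the NativeCodeLoader warning should disappear, provided the libhadoop build in that directory actually matches your platform.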
