Hadoop native libraries not found on OS/X

Problem description

I have downloaded hadoop source code from github and compiled with the native option:

mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true

I then copied the .dylib files to the $HADOOP_HOME/lib

cp -p hadoop-common-project/hadoop-common/target/hadoop-common-2.7.1/lib/native/*.dylib /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib
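
A quick sanity check (not part of the original question) is to confirm that the copied file is a Mach-O dylib for the right architecture and that its own dependencies resolve; file and otool -L are the standard macOS tools for this, and the path below simply mirrors the cp destination above:

    # confirm the copied library is a Mach-O 64-bit dylib and list what it links against
    file /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib/libhadoop.dylib
    otool -L /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib/libhadoop.dylib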

The LD_LIBRARY_PATH was updated and hdfs restarted:

 echo $LD_LIBRARY_PATH
 /usr/local/Cellar/hadoop/2.7.2/libexec/lib:
 /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/lib:/Library/Java/JavaVirtualMachines/jdk1.8.0_92.jdk/Contents/Home//jre/lib
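
One piece of background worth noting (context added here, not something from the original post): the JVM loads native code via java.library.path, which Hadoop's launcher scripts assemble from JAVA_LIBRARY_PATH, so setting LD_LIBRARY_PATH alone may not be enough on macOS. A minimal sketch, assuming the libraries were copied to the directory used in the cp command above:

    # point java.library.path at the directory holding libhadoop.dylib;
    # the hadoop scripts append HADOOP_OPTS to the JVM options
    export JAVA_LIBRARY_PATH=/usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib
    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"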

(Note: this also means that the answer to Hadoop "Unable to load native-hadoop library for your platform" error on docker-spark? does not work for me.)

But checknative still returns false across the board:

$stop-dfs.sh && start-dfs.sh && hadoop checknative
16/06/13 16:12:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [sparkbook]
sparkbook: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [sparkbook]
sparkbook: starting namenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-namenode-sparkbook.out
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-datanode-sparkbook.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-secondarynamenode-sparkbook.out
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop:  false
zlib:    false
snappy:  false
lz4:     false
bzip2:   false
openssl: false
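
An extra diagnostic step (not in the original question) that can show why the loader gives up is to raise Hadoop's log level so that NativeCodeLoader prints the underlying error instead of just the generic warning:

    # run the check with DEBUG logging to see the actual load failure (e.g. an UnsatisfiedLinkError)
    HADOOP_ROOT_LOGGER=DEBUG,console hadoop checknative -a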

Recommended answer

To get this working on a fresh install of macOS 10.12, I had to do the following:

  1. Install the build dependencies with Homebrew (a note on putting protoc 2.5 on the PATH follows the list):

brew install cmake maven openssl protobuf@2.5 snappy

  2. Check out hadoop source code:

    git clone https://github.com/apache/hadoop.git
    cd hadoop
    git checkout rel/release-2.7.3
    

  3. Apply the below patch to the build (one way to apply it is sketched after the list):

    diff --git a/hadoop-common-project/hadoop-common/src/CMakeLists.txt b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
    index 942b19c..8b34881 100644
    --- a/hadoop-common-project/hadoop-common/src/CMakeLists.txt
    +++ b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
    @@ -16,6 +16,8 @@
     # limitations under the License.
     #
    
    +SET(CUSTOM_OPENSSL_PREFIX /usr/local/opt/openssl)
    +
     cmake_minimum_required(VERSION 2.6 FATAL_ERROR)
    
     # Default to release builds
    @@ -116,8 +118,8 @@ set(T main/native/src/test/org/apache/hadoop)
     GET_FILENAME_COMPONENT(HADOOP_ZLIB_LIBRARY ${ZLIB_LIBRARIES} NAME)
    
     SET(STORED_CMAKE_FIND_LIBRARY_SUFFIXES ${CMAKE_FIND_LIBRARY_SUFFIXES})
    -set_find_shared_library_version("1")
    -find_package(BZip2 QUIET)
    +set_find_shared_library_version("1.0")
    +find_package(BZip2 REQUIRED)
     if (BZIP2_INCLUDE_DIR AND BZIP2_LIBRARIES)
         GET_FILENAME_COMPONENT(HADOOP_BZIP2_LIBRARY ${BZIP2_LIBRARIES} NAME)
         set(BZIP2_SOURCE_FILES
    diff --git a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
    index d2ddf89..ac8e351 100644
    --- a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
    +++ b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
    @@ -17,4 +17,8 @@
     <!-- Put site-specific property overrides in this file. -->
    
     <configuration>
    +<property>
    +<name>io.compression.codec.bzip2.library</name>
    +<value>libbz2.dylib</value>
    +</property>
     </configuration>
    diff --git a/hadoop-tools/hadoop-pipes/pom.xml b/hadoop-tools/hadoop-pipes/pom.xml
    index 34c0110..70f23a4 100644
    --- a/hadoop-tools/hadoop-pipes/pom.xml
    +++ b/hadoop-tools/hadoop-pipes/pom.xml
    @@ -52,7 +52,7 @@
                         <mkdir dir="${project.build.directory}/native"/>
                         <exec executable="cmake" dir="${project.build.directory}/native" 
                             failonerror="true">
    -                      <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model}"/>
    +                      <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model} -DOPENSSL_ROOT_DIR=/usr/local/opt/openssl"/>
                         </exec>
                         <exec executable="make" dir="${project.build.directory}/native" failonerror="true">
                           <arg line="VERBOSE=1"/>
    

  4. Build hadoop from source:

    mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
    

  5. Specify JAVA_LIBRARY_PATH when running hadoop (see the notes after the list for making this persistent):

    $ JAVA_LIBRARY_PATH=/usr/local/opt/openssl/lib:/opt/local/lib:/usr/lib hadoop-dist/target/hadoop-2.7.3/bin/hadoop checknative -a
    16/10/14 20:16:32 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library libbz2.dylib
    16/10/14 20:16:32 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    Native library checking:
    hadoop:  true /Users/admin/Desktop/hadoop/hadoop-dist/target/hadoop-2.7.3/lib/native/libhadoop.dylib
    zlib:    true /usr/lib/libz.1.dylib
    snappy:  true /usr/local/lib/libsnappy.1.dylib
    lz4:     true revision:99
    bzip2:   true /usr/lib/libbz2.1.0.dylib
    openssl: true /usr/local/opt/openssl/lib/libcrypto.dylib
    
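A few hedged notes on the steps above, added for convenience rather than taken from the original answer.

For step 1, protobuf@2.5 is a versioned (keg-only) Homebrew formula, so protoc may need to be put on the PATH before building; Hadoop 2.7 expects protoc 2.5.0:

    # use Homebrew's protobuf 2.5 for the build (path is the usual Homebrew opt location)
    export PATH="/usr/local/opt/protobuf@2.5/bin:$PATH"
    protoc --version   # should report libprotoc 2.5.0

For step 3, one way to apply the diff is to save it to a file (hadoop-osx.patch is just an illustrative name) and use git apply from the repository root:

    # preview, then apply, the patch shown in step 3
    git apply --stat hadoop-osx.patch
    git apply hadoop-osx.patch

For step 5, if repeating JAVA_LIBRARY_PATH on every invocation gets tedious, one option is to export it from the built distribution's own environment file, which the launcher scripts source on startup:

    # append to hadoop-dist/target/hadoop-2.7.3/etc/hadoop/hadoop-env.sh
    export JAVA_LIBRARY_PATH=/usr/local/opt/openssl/lib:/opt/local/lib:/usr/lib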
