Trying to use Fuse to mount HDFS. Can't compile libhdfs


Problem Description


I'm attempting to compile libhdfs (a native shared library that allows external apps to interface with hdfs). It's one of the few steps I have to take to mount Hadoop's hdfs using Fuse.

The compilation seems to go well for a while but ends with "BUILD FAILED" and the following problem summary:


commons-logging#commons-logging;1.0.4: configuration not found in commons-logging#commons-logging;1.0.4: 'master'. It was required from org.apache.hadoop#Hadoop;working@btsotbal800 commons-logging


log4j#log4j;1.2.15: configuration not found in log4j#log4j;1.2.15: 'master'. It was required from org.apache.hadoop#Hadoop;working@btsotbal800 log4j

Now, I have a couple of questions about this, since the book I'm using doesn't go into any detail about what these things really are.


  1. Are commons-logging and log4j libraries that Hadoop uses?

  2. These libraries seem to live in $HADOOP_HOME/lib, but they are jar files. Should I extract them, try to change some configuration, and then repackage them back into jars?

  3. What does 'master' mean in the errors above? Are there different versions of these libraries?


Thank you in advance for ANY insight you can provide.

Answer

If you are using Cloudera Hadoop (cdh3u2), you don't need to build the fuse project.

You can find the binary (libhdfs.so*) inside the directory $HADOOP_HOME/c++/lib.
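To confirm the library is actually there, a small helper can search a directory tree for it. This is a sketch for illustration only: the `find_lib` function and the sample paths are made up here, not part of any Hadoop tooling.

```shell
#!/bin/sh
# Sketch: locate a shared library (and its versioned siblings) under a
# directory tree. find_lib is a hypothetical helper, not a Hadoop command.
find_lib() {
    # $1 = root directory to search, $2 = library name prefix (e.g. libhdfs.so)
    find "$1" -name "$2*" 2>/dev/null
}

# Example usage, assuming $HADOOP_HOME points at a cdh3 install:
# find_lib "$HADOOP_HOME/c++" libhdfs.so
```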

Before mounting with fuse, update "$HADOOP_HOME/contrib/fuse-dfs/src/fuse_dfs_wrapper.sh" as follows:

#!/bin/bash

for f in ${HADOOP_HOME}/hadoop*.jar ; do
   export CLASSPATH=$CLASSPATH:$f
done

for f in ${HADOOP_HOME}/lib/*.jar ; do
   export CLASSPATH=$CLASSPATH:$f
done

export PATH=$HADOOP_HOME/contrib/fuse-dfs:$PATH
export LD_LIBRARY_PATH=$HADOOP_HOME/c++/lib:/usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/amd64/server/
fuse_dfs "$@"
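The classpath-building loops in the wrapper can be sanity-checked in isolation. This sketch runs the same pattern against a scratch directory instead of a real $HADOOP_HOME; the directory and jar names are made up for illustration.

```shell
#!/bin/sh
# Sketch of the wrapper's classpath-building pattern: append every jar in a
# directory to a colon-separated list. Uses a temporary directory with dummy
# jars rather than a real Hadoop install.
tmpdir=$(mktemp -d)
touch "$tmpdir/commons-logging.jar" "$tmpdir/hadoop-core.jar"

CLASSPATH=""
for f in "$tmpdir"/*.jar; do
    CLASSPATH="$CLASSPATH:$f"   # append each jar, colon-separated
done
CLASSPATH="${CLASSPATH#:}"      # drop the leading colon

echo "$CLASSPATH"
```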


LD_LIBRARY_PATH contains the list of directories here:
"$HADOOP_HOME/c++/lib" contains libhdfs.so, and
"/usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/amd64/server/" contains libjvm.so.
Modify /usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/amd64/server/ to match your JAVA_HOME.
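Rather than hardcoding the Java 6 path, the libjvm.so directory can be derived from JAVA_HOME. A minimal sketch, assuming the jre/lib/amd64/server layout of a Sun/Oracle JDK 6 on 64-bit Linux (other JDK versions and architectures lay this out differently):

```shell
#!/bin/sh
# Sketch: build the libjvm.so directory from JAVA_HOME instead of hardcoding
# the Java 6 path. The fallback value below is the path from the answer above;
# the jre/lib/amd64/server layout is an assumption about the installed JDK.
JAVA_HOME="${JAVA_HOME:-/usr/lib/jvm/java-6-sun-1.6.0.26}"
JVM_LIB_DIR="$JAVA_HOME/jre/lib/amd64/server"
echo "$JVM_LIB_DIR"
```

The wrapper's LD_LIBRARY_PATH line could then reference $JVM_LIB_DIR instead of the literal path.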

Use the following command to mount hdfs:

fuse_dfs_wrapper.sh dfs://localhost:9000/ /home/510600/mount1
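After running the wrapper it can be worth confirming the mount actually took effect. This sketch checks whether a path is currently a mount point by reading /proc/mounts (so it assumes Linux); the `is_mounted` helper is a hypothetical name, and the mount point is the one from the example above.

```shell
#!/bin/sh
# Sketch: test whether a path is currently a mount point. Reads /proc/mounts
# (Linux-specific); field 2 of each line is the mount point.
is_mounted() {
    awk -v p="$1" '$2 == p { found = 1 } END { exit !found }' /proc/mounts
}

# Example: check the mount point used in the answer above.
if is_mounted /home/510600/mount1; then
    echo "mounted"
else
    echo "not mounted"
fi
```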

For unmounting, use the following command:

fusermount -u /home/510600/mount1

I tested fuse only in Hadoop's pseudo-distributed mode, not in cluster mode.
