hadoop: configuration files


Question

I am new to Hadoop. I am trying to set up a single-node cluster.

I have noticed that the documentation I've read (even on Apache's configuration site) always refers to the configuration files in the conf/ directory. However, when I download version 2.X.X I only see config files in the etc/hadoop directory.

I have googled the heck out of this. I tried reading the Hadoop documentation, but it refers to the 'conf' directory, as explained before.

So, my question is: do I just configure the files where they are, in the etc/hadoop directory, or do I need to move them to the conf directory (creating it myself)?

Thanks.

Answer

In Hadoop 2, the etc/hadoop directory itself is indeed the conf directory; there is no need to create a separate one. A quick way to test this is to switch something like fs.default.name between file:/// and your hdfs://host:port/ setting and run a quick "hadoop fs -ls" to see where you end up.
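
Below is a minimal sketch of that test, assuming a stock Hadoop 2 layout where core-site.xml sits in etc/hadoop; the hdfs://localhost:9000/ URI is only a placeholder for whatever host and port your NameNode uses, and fs.default.name is the older property name (Hadoop 2 also accepts fs.defaultFS):

    <!-- etc/hadoop/core-site.xml -->
    <configuration>
      <property>
        <name>fs.default.name</name>
        <!-- flip this between file:/// (local filesystem) and your HDFS URI -->
        <value>hdfs://localhost:9000/</value>
      </property>
    </configuration>

    # lists the local root when set to file:///, or the HDFS root when set to hdfs://...
    # (the NameNode must be running for the HDFS case)
    hadoop fs -ls /

If editing the copy in etc/hadoop changes what the listing shows, Hadoop is reading its configuration from there, and no separate conf directory is needed.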
