Is there a way to add nodes to a running Hadoop cluster?


Question

I have been playing with Cloudera. I define the size of the cluster before I start my job, then use Cloudera Manager to make sure everything is running.

I'm working on a new project that, instead of using Hadoop, uses message queues to distribute the work, but the results of the work are stored in HBase. I might launch 10 servers to process the jobs and store results to HBase, but I'm wondering: if I later decide to add a few more worker nodes, can I easily (read: programmatically) make them connect to the running cluster automatically, so they can be added to the cluster's HBase/HDFS locally?

Is this possible, and what would I need to learn in order to do it?

Answer

There is documentation for adding a node to Hadoop and to HBase. Looking at that documentation, there is no need to restart the cluster: a node can be added dynamically.
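As a rough sketch of what "dynamically" means in practice, the commands below show the usual sequence for bringing a new worker into a running cluster. This assumes a standard Hadoop 2.x/3.x and HBase layout with `HADOOP_HOME` and `HBASE_HOME` set on the new node, and its config files (`core-site.xml`, `hdfs-site.xml`, `hbase-site.xml`) already pointing at the existing NameNode and ZooKeeper quorum; the hostname `new-worker-01` is a placeholder.

```shell
# On the new node: start a DataNode. It reads the NameNode address from
# core-site.xml / hdfs-site.xml and registers itself automatically.
$HADOOP_HOME/sbin/hadoop-daemon.sh start datanode

# If the node should also run compute tasks, start a YARN NodeManager.
$HADOOP_HOME/sbin/yarn-daemon.sh start nodemanager

# Start an HBase RegionServer; it announces itself to the master
# through ZooKeeper, so no master restart is needed.
$HBASE_HOME/bin/hbase-daemon.sh start regionserver

# Optional, from any node: rebalance existing HDFS blocks onto the
# new DataNode (threshold is the allowed disk-usage spread in percent).
$HADOOP_HOME/bin/hdfs balancer -threshold 10

# On the master: add the hostname to the workers file (named "slaves"
# in Hadoop 2.x) so the start/stop-all helper scripts include it.
echo "new-worker-01" >> $HADOOP_HOME/etc/hadoop/workers
```

Since each step is just a command on the new host, the whole thing scripts easily: bake the commands into a cloud-init/user-data script or push them over SSH when you launch the instance, which gives you the "programmatically join the cluster" behavior you're after.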

