How to configure HBase in Spark?


Question

What are the steps to connect Spark to HBase?

I have the master addresses for both. Do I just add the HBase address into the Spark classpath?

Solution

This post about connecting Spark with HBase should be helpful: http://www.vidyasource.com/blog/Programming/Scala/Java/Data/Hadoop/Analytics/2014/01/25/lighting-a-spark-with-hbase


Do I just add the HBase address into the Spark classpath?

No. Actually, you should put the HBase configuration files on the Spark classpath. If they are not there, you should set the properties in your code, for example:

    Configuration hConf = HBaseConfiguration.create(conf);
    hConf.set("hbase.zookeeper.quorum", "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
    hConf.setInt("hbase.zookeeper.property.clientPort", 10000);
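Alternatively, the same two properties can live in an `hbase-site.xml` file placed on the Spark classpath, which is the file-based route mentioned above. A minimal sketch, reusing the example hostnames and port from the snippet (note that the standard ZooKeeper client port is 2181; 10000 is just this question's value):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- ZooKeeper quorum hosts for the HBase cluster (example values) -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com</value>
  </property>
  <!-- ZooKeeper client port (example value; the HBase default is 2181) -->
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>10000</value>
  </property>
</configuration>
```

With this file on the classpath, `HBaseConfiguration.create()` picks up the settings automatically and no `hConf.set(...)` calls are needed in the code.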


