Spark RDD write to HBase


Question

I am able to read messages from Kafka using the code below:

val ssc = new StreamingContext(sc, Seconds(50)) 
val topicmap = Map("test" -> 1)
val lines = KafkaUtils.createStream(ssc,"127.0.0.1:2181", "test-consumer-group",topicmap)

But I am trying to read each message from Kafka and put it into HBase. This is my code to write into HBase, but with no success:

lines.foreachRDD(rdd => {
  rdd.foreach(record => {
    val i = +1
    val hConf = new HBaseConfiguration() 
    val hTable = new HTable(hConf, "test") 
    val thePut = new Put(Bytes.toBytes(i)) 
    thePut.add(Bytes.toBytes("cf"), Bytes.toBytes("a"), Bytes.toBytes(record)) 
  })
})

Answer

Well, you are not actually executing the Put; you are merely creating a Put request and adding data to it. What you are missing is:

hTable.put(thePut);
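Putting that fix together, here is a hedged sketch of the full loop, kept in the same older `HTable` client API the question uses (newer HBase versions replace it with `Connection`/`Table`). Two further assumptions beyond the original answer: `foreachPartition` is used so one HBase connection is created per partition rather than per record, and because `val i = +1` in the question always evaluates to `1` (unary plus), the sketch derives the row key from the record instead; `record.hashCode` here is just an illustrative placeholder for a real row-key scheme. Note that `KafkaUtils.createStream` yields a `DStream[(String, String)]` of key-value pairs, so each element is destructured.

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HTable, Put}
import org.apache.hadoop.hbase.util.Bytes

lines.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    // One configuration and table handle per partition, not per record,
    // to avoid the cost of reconnecting to HBase for every message.
    val hConf = HBaseConfiguration.create()
    val hTable = new HTable(hConf, "test")
    records.foreach { case (_, record) =>
      // Row key derived from the record; a timestamp or message offset
      // would be a more realistic choice in production.
      val thePut = new Put(Bytes.toBytes(record.hashCode))
      thePut.add(Bytes.toBytes("cf"), Bytes.toBytes("a"), Bytes.toBytes(record))
      // The step the original code was missing: actually execute the Put.
      hTable.put(thePut)
    }
    hTable.close()
  }
}
```

This cannot run outside a Spark Streaming job with an HBase cluster available, so treat it as a structural sketch rather than a drop-in snippet.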
