How to load data into Google Cloud Bigtable from Google BigQuery


Problem description



I need to populate data into Google Cloud Bigtable and the source of the data will be Google BigQuery.

As an exercise, I am able to read the data from BigQuery, and as a separate exercise I am able to write data into Bigtable as well.

Now I have to combine these 2 operations into one Google Cloud Dataflow job. Any example will be of great help.

Solution

You can just use the transforms as shown in those examples, adding whatever logic you need in between, for example:

Pipeline p = Pipeline.create(options);
p.apply(BigQueryIO.Read.from("some_table"))
 .apply(ParDo.of(new DoFn<TableRow, Row>() {
   @Override
   public void processElement(ProcessContext c) {
     Row output = somehowConvertYourDataToARow(c.element());
     c.output(output);
   }
 }))
 .apply(BigtableIO.Write.withTableId("some_other_table"));
p.run();
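The conversion step (`somehowConvertYourDataToARow` above) is the logic you have to supply yourself: picking a Bigtable row key and flattening the BigQuery columns into column-family/qualifier cells. Below is a minimal plain-Java sketch of that idea, using a `Map<String, Object>` to stand in for BigQuery's `TableRow` (which is itself map-like); the field names `user_id` and `score` and the column family `cf` are made-up examples, not anything from the SDK:

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class TableRowToBigtable {
    // Bigtable has a single index, the row key, so derive it from a field
    // that is unique per row (here the assumed "user_id" column).
    static byte[] rowKey(Map<String, Object> bqRow) {
        return String.valueOf(bqRow.get("user_id"))
                     .getBytes(StandardCharsets.UTF_8);
    }

    // Flatten the remaining columns into "family:qualifier" -> value bytes,
    // all under one column family ("cf" is an arbitrary choice).
    static Map<String, byte[]> cells(Map<String, Object> bqRow) {
        Map<String, byte[]> out = new HashMap<>();
        for (Map.Entry<String, Object> e : bqRow.entrySet()) {
            if (!e.getKey().equals("user_id")) {
                out.put("cf:" + e.getKey(),
                        String.valueOf(e.getValue())
                              .getBytes(StandardCharsets.UTF_8));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> bqRow = new HashMap<>();
        bqRow.put("user_id", "alice");
        bqRow.put("score", 42);
        System.out.println(new String(rowKey(bqRow), StandardCharsets.UTF_8));
        System.out.println(new String(cells(bqRow).get("cf:score"),
                                      StandardCharsets.UTF_8));
    }
}
```

In the real `DoFn` you would apply the same mapping to `c.element()` and emit whatever mutation type your Bigtable connector expects for its write transform.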
