Do Dataflow jobs hit any BigQuery quotas and limits?


Question

I have around 1500 jobs to be implemented using Dataflow, scheduled on a daily basis. We may end up issuing a huge number of DML statements through the BigQuery client library within those jobs. Listed below are my concerns regarding BigQuery quotas and limits.



Reference: https://cloud.google.com/bigquery/quotas

Please confirm whether we need to take BigQuery's daily usage limits into consideration in either of the scenarios below.


  1. If we implement data inserts using BigqueryIO.write()
  2. If we use DML statements (UPDATE/DELETE) via the BigQuery client library within the Dataflow job

Please advise. (Minimal sketches of both scenarios are included below.)
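
For the first scenario, here is a minimal sketch of what such a BigQueryIO.write() insert might look like with the Beam Java SDK. The table reference, the "name" field, and the choice of FILE_LOADS as the write method are illustrative assumptions rather than details from the question; the relevant point is that each write method maps onto a different set of BigQuery limits (load-job limits for FILE_LOADS, streaming-insert limits for STREAMING_INSERTS).

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class BigQueryWriteSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(Create.of("alice", "bob"))
        // Convert each element into a BigQuery TableRow; "name" is a placeholder field.
        .apply(MapElements
            .into(TypeDescriptor.of(TableRow.class))
            .via((String name) -> new TableRow().set("name", name)))
        .setCoder(TableRowJsonCoder.of())
        // The chosen write method decides which BigQuery quotas apply:
        // FILE_LOADS counts against load-job limits, STREAMING_INSERTS against
        // streaming-insert limits.
        .apply(BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder table reference
            .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    p.run();
  }
}
```

In newer Beam SDKs the same transform can also target the Storage Write API (Method.STORAGE_WRITE_API), which again comes with its own quota entries on the page linked above.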

Solution

You absolutely do need to take BigQuery quotas and limits into consideration, even when hooking into it from Dataflow.



Dataflow is just calling the BigQuery API on your behalf. Therefore, all quotas and limits still apply, just as if you were calling it directly yourself.
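
To make that concrete for the second scenario, below is a minimal sketch of a DML call issued through the google-cloud-bigquery Java client, for example from inside a Dataflow job. The dataset, table, column names, and the UPDATE statement itself are placeholders; what matters is that the statement runs as an ordinary query job, so the DML and query quotas on the page linked above apply exactly as they would outside Dataflow.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;

public class DmlQuotaSketch {
  public static void main(String[] args) throws InterruptedException {
    // Uses application-default credentials; table and column names are placeholders.
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // A DML statement is executed as a regular query job, so the usual
    // query/DML quotas and limits apply, exactly as if run outside Dataflow.
    QueryJobConfiguration update = QueryJobConfiguration.newBuilder(
            "UPDATE `my_dataset.my_table` SET status = 'processed' WHERE status = 'pending'")
        .build();

    bigquery.query(update); // blocks until the job finishes; quota errors surface as exceptions
  }
}
```

With roughly 1500 daily jobs each potentially issuing many such statements, the DML and concurrent-query entries on that quotas page are the ones to review first.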



