Can Cloud Dataflow streaming job scale to zero?

Problem description

I'm using Cloud Dataflow streaming pipelines to insert events received from Pub/Sub into a BigQuery dataset. I need a few of them to keep each job simple and easy to maintain.
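For reference, a minimal sketch of such a pipeline with the Apache Beam Python SDK might look like the following; the project, topic, bucket, and table names are placeholders, not values from the question:

```python
# Minimal sketch: stream Pub/Sub messages into BigQuery with the Beam Python SDK.
# All resource names below are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                 # placeholder project id
    region="us-central1",                 # placeholder region
    temp_location="gs://my-bucket/tmp",   # placeholder bucket
    streaming=True,
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")          # placeholder topic
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:my_dataset.events",                     # placeholder table
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```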

My concern is about the overall cost. The volume of data is not very high, and during a few periods of the day there isn't any data (no messages on Pub/Sub).

I would like Dataflow to scale to 0 workers until a new message is received, but it seems the minimum is 1 worker.

So the minimum price for each job per day would be 24 vCPU-hours... so at least $50 a month per job (without any discount for monthly usage).

I plan to run and drain my jobs via the API a few times per day to avoid one full-time worker, but this doesn't seem like the right approach for a managed service like Dataflow. Draining could be scripted roughly as sketched below.
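A rough sketch of draining a running job through the Dataflow REST API with google-api-python-client; the project, region, and job id are placeholders:

```python
# Sketch: request a drain of a streaming Dataflow job via the v1b3 REST API.
# The job keeps running until in-flight data is flushed, then stops.
from googleapiclient.discovery import build


def drain_job(project_id: str, region: str, job_id: str) -> dict:
    """Set the requested state of a Dataflow job to DRAINING."""
    dataflow = build("dataflow", "v1b3")
    request = dataflow.projects().locations().jobs().update(
        projectId=project_id,
        location=region,
        jobId=job_id,
        body={"requestedState": "JOB_STATE_DRAINING"},
    )
    return request.execute()


# Example usage (placeholder values):
# drain_job("my-project", "us-central1", "2017-01-01_00_00_00-1234567890123456789")
```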

Is there something I'm missing?

Recommended answer

Dataflow can't scale to 0 workers, but your alternatives would be to use Cron, or Cloud Functions, to create a Dataflow streaming job whenever an event triggers it; and for stopping the Dataflow job by itself, you can read the answers to this question. A sketch of the Cloud Functions approach follows.
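As an illustration only (not the linked example), a Cloud Function that launches a Dataflow job from a template whenever it is triggered could look roughly like this; the project, bucket, and template path are placeholders:

```python
# Sketch: background Cloud Function (Python runtime) that launches a Dataflow
# job from a template via the v1b3 REST API. All names are placeholders.
from googleapiclient.discovery import build

PROJECT = "my-project"                                   # placeholder
REGION = "us-central1"                                   # placeholder
TEMPLATE = "gs://my-bucket/templates/pubsub-to-bq"       # placeholder template path


def launch_dataflow(event, context):
    """Entry point for a background trigger (e.g. Pub/Sub or Cloud Scheduler)."""
    dataflow = build("dataflow", "v1b3")
    body = {
        "jobName": "pubsub-to-bq-streaming",
        "environment": {"tempLocation": "gs://my-bucket/tmp"},  # placeholder
        # "parameters": {...}  # template-specific parameters, if any
    }
    request = dataflow.projects().locations().templates().launch(
        projectId=PROJECT,
        location=REGION,
        gcsPath=TEMPLATE,
        body=body,
    )
    response = request.execute()
    print("Launched job:", response.get("job", {}).get("id"))
```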

You can find an example here for both cases (Cron and Cloud Functions); note that Cloud Functions is no longer in Alpha release and has been in General Availability since July.
