BigQuery stream: 'Failed to insert XX rows due to timeout'


Problem Description



In recent days, our streaming inserts have been failing with

"Failed to insert XX rows. First error: {"errors":[{"reason":"timeout"}],"index":YY}"  

Over the past half month of continuous streaming from an unchanged data source and unchanged program scripts, no such failure had occurred.

project id: red-road-574

Solution

Fellow BigQuery team member here.

It looks like our documentation is a bit incorrect, in that we can have partial commit of the rows. We'll fully reject the request if there are invalid rows (structure mismatch), but individual rows may fail to be buffered.

In this case, only the rows indicated failed to commit. If you have an insert id you can simply retry the failed rows, or retry the full request if desired (though each retried row will count against your table quota).
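The retry logic described above can be sketched as a small helper that, given the per-row errors returned by a streaming insert (each entry carries the failed row's `index`, as in the error message in the question), selects only the failed rows together with their original insert ids for resubmission. The function and variable names here are illustrative, not part of any official client library:

```python
def rows_to_retry(rows, row_ids, insert_errors):
    """Return the (row, insert_id) pairs that should be resubmitted.

    `insert_errors` is the per-row error list from a streaming insert,
    where each entry looks like {"errors": [{"reason": "timeout"}], "index": YY}.
    Reusing the original insert id lets BigQuery deduplicate in case the
    original row was actually buffered despite the timeout.
    """
    failed_indexes = [
        entry["index"]
        for entry in insert_errors
        if any(err.get("reason") == "timeout" for err in entry["errors"])
    ]
    return [(rows[i], row_ids[i]) for i in failed_indexes]

# Example mirroring the error in the question (row at index 1 timed out):
rows = [{"v": 1}, {"v": 2}, {"v": 3}]
ids = ["id-0", "id-1", "id-2"]
errors = [{"errors": [{"reason": "timeout"}], "index": 1}]
retry = rows_to_retry(rows, ids, errors)
# retry == [({"v": 2}, "id-1")]
```

The retried rows would then be resubmitted in a new insert request; as noted above, each retried row still counts against the table quota.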

This increased occurrence of these row-level errors is due to a change around how we handle batches of insertions. Previously, the entire request would have encountered a timeout.

Hope that helps. Sean
