Google Data Fusion execution error "INVALID_ARGUMENT: Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 2048.0."


Problem description

I am trying to load a simple CSV file from GCS to BQ using the Google Data Fusion Free edition. The pipeline fails with the following error:

com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 2048.0.
    at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:49) ~[na:na]
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72) ~[na:na]
    at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60) ~[na:na]
    at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97) ~[na:na]
    at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68) ~[na:na]

The same error is repeated for both the MapReduce and Spark execution engines. I would appreciate any help in fixing this issue. Thanks.

Regards, KA

Recommended answer

It means that the requested total Compute Engine disk space would put the project over its GCE quota. There are both project-wide and regional quotas. You can see the documentation here: https://cloud.google.com/compute/quotas
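As a quick check before requesting an increase, you can inspect the current regional quota and usage with `gcloud`. This is a minimal sketch; the region name `us-central1` is an example and should be replaced with the region your Data Fusion instance provisions its Dataproc cluster in:

```shell
# List the DISKS_TOTAL_GB quota entry (limit/usage) for one region.
# The quota block in the YAML output has the shape:
#   - limit: 2048.0
#     metric: DISKS_TOTAL_GB
#     usage: 0.0
gcloud compute regions describe us-central1 \
  --format="yaml(quotas)" | grep -B1 -A1 "DISKS_TOTAL_GB"
```

If `limit` minus `usage` is smaller than the disk total the pipeline requests (3000 GB here), the provisioning call will fail with exactly the error above.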

To resolve this, you should increase the quota in your GCP project.
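For context on where the 3000 GB figure likely comes from: Data Fusion runs pipelines on an ephemeral Dataproc cluster, and the requested `DISKS_TOTAL_GB` is roughly the per-node disk size times the node count. The numbers below are an assumed default sizing, not values confirmed by the error message; a cluster of three 1000 GB nodes would account for the request:

```shell
# Hypothetical cluster sizing (assumed values for illustration).
NODES=3
DISK_GB_PER_NODE=1000

REQUESTED=$((NODES * DISK_GB_PER_NODE))
AVAILABLE=2048

echo "Requested DISKS_TOTAL_GB: ${REQUESTED}"   # 3000
echo "Available in region:      ${AVAILABLE}"

# The request exceeds the quota, so provisioning fails.
[ "${REQUESTED}" -gt "${AVAILABLE}" ] && echo "Over quota by $((REQUESTED - AVAILABLE)) GB"
</nowiki>```

Besides raising the quota, an alternative (if your workload allows it) is to shrink what the pipeline asks for: in the pipeline's Dataproc compute profile you can lower the worker disk size or worker count so the total stays under the regional limit. The exact profile field names vary by Data Fusion version, so check them in your instance's System Admin > Compute Profiles page.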

