Move data from Google Cloud SQL to Cloud Datastore


Question

I am trying to move my data from Cloud SQL to Cloud Datastore.

There are a bit under 5 million entries in the SQL database.

It seems like I can only move over 100,000 entities per day before I get a quota error.

I can't figure out which exact quota I'm exceeding, but I do use exponential backoff to make sure I'm not sending the writes too fast.

Eventually the backoff reaches 5 minutes and the connection to the SQL server dies, but I don't think the writes-per-second quota is the problem, and I don't see any other quota being exceeded on my APIs page or the App Engine API page.
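For context, the backoff described above looks roughly like this minimal sketch; the write_batch callable and the retry parameters are placeholders for illustration, not the actual code from the gists below.

import random
import time

def write_with_backoff(write_batch, batch, max_retries=8):
    # Retry a batch write with exponential backoff, capped at 5 minutes per sleep.
    delay = 1.0
    for attempt in range(max_retries):
        try:
            return write_batch(batch)
        except Exception:  # e.g. a RESOURCE_EXHAUSTED error from the Datastore API
            if attempt == max_retries - 1:
                raise
            # Sleep with jitter, doubling the delay each time, capped at 300 seconds.
            time.sleep(min(delay + random.uniform(0, 1), 300))
            delay *= 2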

I have tried two different APIs to write the records.

The GCP Datastore API:

import googledatastore

Here is the code:
https://gist.github.com/nburn42/d8b488da1d2dc53df63f4c4a32b95def
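The gist itself is not reproduced here, but batching rows into Datastore writes looks roughly like the following sketch. It uses the google.cloud.datastore client rather than the legacy googledatastore module imported above, and the project id, kind name, and sample rows are assumptions for illustration.

from google.cloud import datastore

# Assumed project id and kind name, purely for illustration.
client = datastore.Client(project="my-project")

rows = [  # stand-in for rows fetched from Cloud SQL
    {"id": 1, "name": "alice", "score": 10},
    {"id": 2, "name": "bob", "score": 7},
]

entities = []
for row in rows:
    key = client.key("SqlRecord", row["id"])  # kind "SqlRecord" is assumed
    entity = datastore.Entity(key=key)
    entity.update({k: v for k, v in row.items() if k != "id"})
    entities.append(entity)

# put_multi batches the writes; a single call accepts at most 500 entities.
client.put_multi(entities)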

And the Dataflow API:

from apache_beam.io.gcp.datastore.v1.datastoreio import WriteToDatastore

Here is the code:
https://gist.github.com/nburn42/2c2a06e383aa6b04f84ed31548f1cb09
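Again, the gist is not reproduced here; a minimal Beam pipeline along these lines might look like the sketch below. Note that it uses the newer v1new Datastore connector rather than the v1 module imported above, and the project id, kind name, and sample rows are assumptions.

import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import WriteToDatastore
from apache_beam.io.gcp.datastore.v1new.types import Entity, Key

PROJECT = "my-project"  # assumed project id

def row_to_entity(row):
    # Turn one Cloud SQL row (a dict) into a Datastore entity.
    key = Key(["SqlRecord", row["id"]], project=PROJECT)  # kind "SqlRecord" is assumed
    entity = Entity(key)
    entity.set_properties({k: v for k, v in row.items() if k != "id"})
    return entity

rows = [  # stand-in for rows read from Cloud SQL
    {"id": 1, "name": "alice", "score": 10},
    {"id": 2, "name": "bob", "score": 7},
]

with beam.Pipeline() as p:
    (p
     | "Rows" >> beam.Create(rows)
     | "To Entities" >> beam.Map(row_to_entity)
     | "Write To Datastore" >> WriteToDatastore(PROJECT))

The RESOURCE_EXHAUSTED error quoted below surfaces from the write step of a pipeline like this once the quota (or, as it turned out, the spending limit) is hit.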

Here is the error I see after one or two hundred thousand good writes:

RPCError: datastore call commit [while running 'Write To Datastore/Write Mutation to Datastore'] failed: Error code: RESOURCE_EXHAUSTED. Message: Quota exceeded.

I'm running this on Compute Engine.

Any help is greatly appreciated! Thanks,
Nathan

Answer

I asked for a quota increase, and someone at Google checked my account to find the problem.

Here is their response:

I understand that you want to know what specific quota you are reaching whenever you try to backup your Cloud SQL to Cloud Datastore.

Upon checking your project, it seems that the problem is that your App Engine application is at or near its spending limit. As of this time of writing, the Datastore Write Operations you have executed cost you $1.10, which will be refreshed after 5 hours. It can definitely cause your resources to become unavailable until the daily spending limit is replenished. Kindly try to increase your spending limit as soon as possible to avoid service interruption, and then run or execute your Datastore write operations.

Give this a shot and let me know what happens. I will be looking forward to your reply.

This fixed the problem. I just needed to go into App Engine and set a much higher daily spending limit.

Hopefully the code I included above will help others.

