Google Cloud Datastore: Bulk Importing with Node.js


Problem description

I need to write a huge quantity of entities (1.5 million lines from a .csv file) to Google Cloud Datastore. It's kind of a two-part question:

Can I do this (or is kind a required property?):

const item = {
    family: "chevrolet",
    series: "impala",
    data: {
        sku: "chev-impala",
        description: "Chevrolet Impala Sedan",
        price: "20000"
    }
}

Then, regarding importing, I'm unsure how this works. If I can't simply dump/upload/import a huge .json file, I want to use Node.js. I would like each entity to have an auto-generated universal ID. Is there an asynchronous means of writing? I have a Node script that pipes out a few hundred entities/records at a time and pauses while awaiting the write to resolve... which is what I'm looking for: a promise-based import.
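For context, with the @google-cloud/datastore Node.js client an entity is saved as a key/data pair: the kind lives on the key rather than being a property of the data, and a key created with only a kind gets a numeric ID auto-allocated when the entity is committed. A minimal sketch, assuming that client (the Car kind name is purely illustrative):

const { Datastore } = require('@google-cloud/datastore');

const datastore = new Datastore();

// An incomplete key (kind only) gets a numeric ID allocated by Datastore on save.
const key = datastore.key('Car');

datastore
    .save({
        key,
        data: {
            family: "chevrolet",
            series: "impala",
            sku: "chev-impala",
            description: "Chevrolet Impala Sedan",
            price: "20000"
        }
    })
    .then(() => console.log('entity saved'))
    .catch(console.error);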

Solution

You can use Apache Beam to import data from a CSV file into Cloud Datastore. Take a look at the thread: Import CSV into google cloud datastore.

How to work with entities is explained in the documentation here.
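If the Node.js route described in the question is preferred, the batching it asks for can be expressed with promises roughly as follows. This is only a sketch under several assumptions: the @google-cloud/datastore client, a hypothetical cars.csv whose columns are family, series, sku, description and price with no quoted commas, an illustrative Car kind, and Cloud Datastore's limit of 500 entities per commit.

const fs = require('fs');
const readline = require('readline');
const { Datastore } = require('@google-cloud/datastore');

const datastore = new Datastore();
const BATCH_SIZE = 500; // Cloud Datastore accepts at most 500 entities per commit

async function importCsv(path) {
    const rl = readline.createInterface({ input: fs.createReadStream(path) });

    let batch = [];
    for await (const line of rl) {
        // Naive CSV parsing; assumes no quoted fields containing commas.
        const [family, series, sku, description, price] = line.split(',');

        batch.push({
            key: datastore.key('Car'), // incomplete key -> ID auto-allocated on commit
            data: { family, series, sku, description, price }
        });

        if (batch.length === BATCH_SIZE) {
            await datastore.save(batch); // pause until this batch resolves
            batch = [];
        }
    }

    if (batch.length > 0) {
        await datastore.save(batch); // flush the final partial batch
    }
}

importCsv('./cars.csv').catch(console.error);

Each await pauses the read loop until the previous batch has been committed, which matches the "pause awaiting write resolve" behaviour the question describes.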

Exporting and importing entities is a fully managed service, and you can only import entities that were previously exported with the managed export and import service.
