Automate file upload from Google Cloud Storage to BigQuery
Question
We have an automated FTP process set up that imports a data file into Google Cloud Storage daily.
I would like to set up a daily automated job that uploads this CSV into a BigQuery table.
What is the best way to do this? My current first thought is to set up an App Engine instance with a cron job that runs a Python script every day. Is there a better solution?
Answer
A Background Cloud Function with a Cloud Storage trigger is your best choice!
You can set it up to monitor a specific bucket for new files and execute the load script whenever the trigger fires.
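As a rough illustration, here is a minimal sketch of such a trigger handler in Python. The project, dataset, and bucket names are placeholders, the date-suffix naming convention is an assumption about the daily files, and the actual load call is only shown in comments since it requires the `google-cloud-bigquery` client and GCP credentials:

```python
import os
import re


def table_id_for(object_name, project="my-project", dataset="daily_imports"):
    """Derive a BigQuery table ID from an uploaded object name.

    e.g. 'exports/sales_2024-01-15.csv' -> 'my-project.daily_imports.sales'
    Returns None for non-CSV objects so the trigger can skip them.
    The date-suffix convention is assumed, not part of the original question.
    """
    if not object_name.lower().endswith(".csv"):
        return None
    stem = os.path.splitext(os.path.basename(object_name))[0]
    # Strip a trailing date suffix like _2024-01-15 so daily files share one table.
    stem = re.sub(r"_\d{4}-\d{2}-\d{2}$", "", stem)
    return f"{project}.{dataset}.{stem}"


def on_finalize(event, context):
    """Entry point for a Cloud Storage 'finalize' (new object) trigger.

    `event` carries the object metadata, including 'bucket' and 'name'.
    """
    table_id = table_id_for(event["name"])
    if table_id is None:
        return  # not a CSV; ignore
    uri = f"gs://{event['bucket']}/{event['name']}"
    # Inside GCP this would start a load job, roughly:
    #   from google.cloud import bigquery
    #   client = bigquery.Client()
    #   job_config = bigquery.LoadJobConfig(
    #       source_format=bigquery.SourceFormat.CSV,
    #       skip_leading_rows=1,
    #       autodetect=True,
    #       write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    #   )
    #   client.load_table_from_uri(uri, table_id, job_config=job_config).result()
    print(f"would load {uri} into {table_id}")
```

With this approach there is no schedule to maintain at all: the load runs as soon as the FTP process finishes writing the object, and non-CSV uploads are ignored.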
Forgot to mention - Cloud Functions (as of now) support only Node.js for scripting - which is usually not a problem, but just wanted to mention it :o)