Reading Data From Cloud Storage Via Cloud Functions
Question
I am trying to do a quick proof of concept for building a data processing pipeline in Python. To do this, I want to build a Google Cloud Function which will be triggered when certain .csv files are dropped into Cloud Storage.
I followed along with this Google Cloud Functions Python tutorial, and while the sample code does trigger the function to write some simple logs when a file is dropped, I am really stuck on what call I have to make to actually read the contents of the file. I tried to search for an SDK/API guidance document but have not been able to find one.
In case this is relevant: once I process the .csv, I want to be able to publish some data that I extract from it to GCP's Pub/Sub.
Answer
The function does not actually receive the contents of the file, just some metadata about it.
You'll want to use the google-cloud-storage client. See the "Downloading Objects" guide for more details.
Putting that together with the tutorial you're using, you get a function like:
from google.cloud import storage

storage_client = storage.Client()

def hello_gcs_generic(data, context):
    """Background Cloud Function triggered by a Cloud Storage event.

    `data` is the event payload; it includes the bucket and object name
    of the file that triggered the function.
    """
    bucket = storage_client.get_bucket(data['bucket'])
    blob = bucket.blob(data['name'])
    # Returns the object's contents as bytes.
    contents = blob.download_as_string()
    # Process the file contents, etc...
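Since the question mentions extracting data from the .csv, here is a minimal sketch of parsing the downloaded bytes with Python's standard `csv` module. The column names (`name`, `score`) and the UTF-8 encoding are assumptions for illustration; the actual fields depend on your files:

```python
import csv
import io

def extract_rows(contents: bytes):
    """Parse raw CSV bytes (as returned by blob.download_as_string())
    into a list of dicts keyed by the header row."""
    reader = csv.DictReader(io.StringIO(contents.decode("utf-8")))
    return list(reader)

# In-memory sample standing in for a downloaded blob:
sample = b"name,score\nalice,10\nbob,7\n"
rows = extract_rows(sample)
# rows[0] == {"name": "alice", "score": "10"}
```

From there, each row dict could be serialized (e.g. `json.dumps(row).encode("utf-8")`) and handed to the google-cloud-pubsub publisher client to push it into a topic.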