Reading Data From Cloud Storage Via Cloud Functions


Question

I am trying to do a quick proof of concept for building a data processing pipeline in Python. To do this, I want to build a Google Cloud Function which will be triggered when certain .csv files are dropped into Cloud Storage.

I followed along with this Google Cloud Functions Python tutorial, and while the sample code does trigger the Function to create some simple logs when a file is dropped, I am really stuck on what call I have to make to actually read the contents of the data. I tried to search for an SDK/API guidance document but have not been able to find one.

In case this is relevant: once I process the .csv, I want to be able to publish some of the data I extract from it to GCP's Pub/Sub.

Answer

The function does not actually receive the contents of the file, just some metadata about it.
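For reference, that metadata arrives as the data dict passed to the function (the google.storage.object.finalize event the tutorial wires up). A minimal sketch that only logs a few of those fields, assuming the standard event shape, might look like:

def inspect_event(data, context):
    # inspect_event is just an illustrative name; the fields below come from
    # the standard Cloud Storage object-finalize event payload.
    print('Event ID: {}'.format(context.event_id))
    print('Bucket:   {}'.format(data['bucket']))
    print('Object:   {}'.format(data['name']))
    print('Created:  {}'.format(data['timeCreated']))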

You'll want to use the google-cloud-storage client. See the "Downloading Objects" guide for more details.

Putting that together with the tutorial you're using, you get a function like:

from google.cloud import storage

# Instantiate the client once per instance so it can be reused across invocations.
storage_client = storage.Client()

def hello_gcs_generic(data, context):
    # The event payload only carries metadata, so fetch the object explicitly.
    bucket = storage_client.get_bucket(data['bucket'])
    blob = bucket.blob(data['name'])
    contents = blob.download_as_string()
    # Process the file contents, etc...
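Since the eventual goal is to push values from the .csv into Pub/Sub, here is a rough sketch of that next step using the google-cloud-pubsub client. The project ID, topic name, and function name below are placeholders, and the per-row handling is only an assumption about what "processing" means in your case:

import csv
import io
import json

from google.cloud import pubsub_v1, storage

storage_client = storage.Client()
publisher = pubsub_v1.PublisherClient()
# 'my-project' and 'my-topic' are placeholders; substitute your own values.
topic_path = publisher.topic_path('my-project', 'my-topic')

def csv_to_pubsub(data, context):
    # Download the object that triggered the function.
    bucket = storage_client.get_bucket(data['bucket'])
    blob = bucket.blob(data['name'])
    contents = blob.download_as_string()

    # Parse the CSV in memory and publish one message per row.
    reader = csv.DictReader(io.StringIO(contents.decode('utf-8')))
    futures = [
        publisher.publish(topic_path, json.dumps(row).encode('utf-8'))
        for row in reader
    ]
    # Block until all messages are accepted so none are lost when the
    # function instance shuts down.
    for future in futures:
        future.result()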
