Writing a new file to a Google Cloud Storage bucket from a Google Cloud Function (Python)

Problem description

I am trying to write a new file (not upload an existing file) to a Google Cloud Storage bucket from inside a Python Google Cloud Function.

  • I tried using google-cloud-storage but it does not have the "open" attribute for the bucket.

  • I tried to use the App Engine library GoogleAppEngineCloudStorageClient but the function cannot be deployed with this dependency.

  • I tried to use gcs-client but I cannot pass the credentials inside the function because it requires a JSON file.

Any ideas would be much appreciated.

Thanks.

Recommended answer

You have to create your file locally and then push it to GCS. You can't create a file dynamically in GCS by using open.

For this, you can write to the /tmp directory, which is an in-memory file system. Note that you will never be able to create a file larger than the amount of memory allocated to your function minus the memory footprint of your code. With a 2 GB function, you can expect a maximum file size of about 1.5 GB.
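
A minimal sketch of this approach, assuming the google-cloud-storage library is listed in requirements.txt; the bucket name, object path, and file name below are placeholders:

```python
from google.cloud import storage

def write_to_gcs(request):
    # /tmp is the only writable path inside a Cloud Function; it is backed by memory.
    local_path = "/tmp/report.txt"
    with open(local_path, "w") as f:
        f.write("content generated inside the function")

    # The client uses the function's default service account credentials,
    # so no JSON key file is needed.
    client = storage.Client()
    bucket = client.bucket("my-bucket")           # placeholder bucket name
    blob = bucket.blob("reports/report.txt")      # destination object name
    blob.upload_from_filename(local_path)

    return "uploaded"
```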

Note: GCS is not a file system, so you shouldn't use it as one.
