Writing a new file to a Google Cloud Storage bucket from a Google Cloud Function (Python)
Question
I am trying to write a new file (not upload an existing file) to a Google Cloud Storage bucket from inside a Python Google Cloud Function.
I tried using google-cloud-storage, but it does not have an "open" attribute for the bucket.
I tried to use the App Engine library GoogleAppEngineCloudStorageClient, but the function cannot be deployed with this dependency.
I tried using gcs-client, but I cannot pass credentials inside the function because it requires a JSON file.
Any ideas would be appreciated. Thanks.
Answer
You have to create your file locally and then push it to GCS. You can't create a file in place in GCS by using open.
For this, you can write to the /tmp directory, which is an in-memory file system. Note that you will never be able to create a file larger than the amount of memory allocated to your function minus the memory footprint of your code. With a 2 GB function, you can expect a maximum file size of about 1.5 GB.
Note: GCS is not a file system, and you should not use it like one.