Load big file from Google Cloud Storage into Google Cloud Functions?


Question

Is there a way to load big files (>100MB) from Google Cloud Storage into Google Cloud Functions? I read in their quotas that the "Max event size for background functions" is limited to 10MB. Can I read it chunk-wise or something like that?

Thanks a lot.

Answer

Cloud Functions for Storage are triggered with the metadata for the file, which is relatively small and won't hit the max-event-size limit.
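To make this concrete: the event delivered to a Storage-triggered background function carries only the object's metadata, not its contents. A sketch of such a payload (bucket and object names here are hypothetical, and only a few of the actual fields are shown):

```json
{
  "bucket": "my-example-bucket",
  "name": "uploads/big-video.mp4",
  "contentType": "video/mp4",
  "size": "157286400",
  "timeCreated": "2019-01-15T10:00:00.000Z"
}
```

Even for a 150MB object, this metadata document is a few hundred bytes, which is why the 10MB event-size quota is not a problem.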

To access the actual contents of the file, you'll use the node.js package for Cloud Storage, which is not affected by the 10MB limit.
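A minimal sketch of that pattern, assuming the `@google-cloud/storage` client library and default credentials inside the function; the function name and the processing done per chunk are placeholders:

```javascript
// Sketch: stream a large object chunk-wise inside a background Cloud Function.
// The event argument carries only the object's metadata (bucket, name, ...);
// the contents are fetched separately via a read stream, so the 10MB
// event-size quota does not apply to the file itself.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();

exports.processLargeFile = async (event) => {
  const file = storage.bucket(event.bucket).file(event.name);

  let totalBytes = 0;
  await new Promise((resolve, reject) => {
    file.createReadStream()                    // streams the object in chunks
      .on('data', (chunk) => {
        totalBytes += chunk.length;            // process each chunk here
      })
      .on('error', reject)
      .on('end', resolve);
  });

  console.log(`Read ${totalBytes} bytes from gs://${event.bucket}/${event.name}`);
};
```

Because the contents arrive as a stream, memory usage stays bounded by the chunk size rather than the file size, which matters given the limited memory allocated to a function instance.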
