Batch request with Google Cloud Storage python client


Question

I can't find any examples on how to use the python google cloud storage's batch functionality. I see it exists here.

I'd love a concrete example. Let's say I want to delete a bunch of blobs with a given prefix. I'd start by getting the list of blobs as follows:

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
blobs_to_delete = bucket.list_blobs(prefix="my/prefix/here")

# how do I delete the blobs in blobs_to_delete in a single batch?

# bonus: if I have more than 100 blobs to delete, handle the limitation
#        that a batch can only handle 100 operations

Answer

TL;DR - Just send all the requests within the batch() context manager (available in the google-cloud-python library).

Try this example:

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
# Accumulate the iterated results in a list prior to issuing
# batch within the context manager
blobs_to_delete = list(bucket.list_blobs(prefix="my/prefix/here"))

# Use the batch context manager to delete all the blobs
with storage_client.batch():
    for blob in blobs_to_delete:
        blob.delete()

You only need to worry about the 100 items per batch if you're using the REST APIs directly. The batch() context manager automatically takes care of this restriction and will issue multiple batch requests if needed.
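If you'd rather not rely on the context manager to split the requests for you, a defensive pattern is to chunk the blob list yourself and open one batch per chunk. A minimal sketch of the chunking helper (the `chunked` name and the batch size of 100 are illustrative, not part of the library's API):

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Applied to the example above (requires a real client and bucket):
# for chunk in chunked(blobs_to_delete, 100):
#     with storage_client.batch():
#         for blob in chunk:
#             blob.delete()

print(list(chunked(range(5), 2)))  # → [[0, 1], [2, 3], [4]]
```

Each `with storage_client.batch():` block then issues one HTTP batch request of at most 100 deletes, which keeps you within the REST API's documented limit regardless of how the client library behaves.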
