Programmatically emulating "gsutil mv" on appengine cloudstorage in python


Question



I would like to implement a mv (copy-in-the-cloud) operation on Google Cloud Storage that is similar to how gsutil does it (http://developers.google.com/storage/docs/gsutil/commands/mv).

I read somewhere earlier that this involves a read and a write (download and re-upload) of the data, but I cannot find the passages again.

Is this the correct way to move a file in Cloud Storage, or does one have to go a level down to the boto library to avoid copying the data over the network just to rename the file?

    import cloudstorage

    # This streams the object through the App Engine instance:
    # every byte is read from src and written back out to dst.
    istream = cloudstorage.open(src, mode='r')
    ostream = cloudstorage.open(dst, content_type=src_content, mode='w')

    while True:
        buf = istream.read(500000)
        if not buf:
            break

        ostream.write(buf)

    istream.close()
    ostream.close()
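
As a side note on the boto route mentioned above: boto can ask Cloud Storage to perform the copy server-side, so no object data has to flow through the application. This is only a rough sketch, not part of the original post; it assumes boto is configured with Google Storage credentials (e.g. in ~/.boto), and the bucket and object names are hypothetical:

    import boto

    conn = boto.connect_gs()                  # uses GS credentials from the boto config
    bucket = conn.get_bucket('my-bucket')
    # copy_key(new_key_name, src_bucket_name, src_key_name) copies within the cloud.
    bucket.copy_key('new/name.txt', 'my-bucket', 'old/name.txt')
    bucket.delete_key('old/name.txt')         # removing the source completes the "mv"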
    

Update: I found the REST API that supports copy and compose operations and much more. It seems that there is hope that we do not have to copy data across continents to rename something.
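
Along the same lines, newer releases of the App Engine GCS client library (linked below) expose copy2 and delete calls that together give an in-cloud rename. This is a sketch under the assumption that the installed version of the library already ships copy2; the /bucket/object paths are hypothetical:

    import cloudstorage

    def gcs_mv(src, dst):
        """Move src to dst without streaming the bytes through the instance."""
        # copy2 asks GCS to copy the object server-side;
        # deleting the source then completes the move.
        cloudstorage.copy2(src, dst)
        cloudstorage.delete(src)

    # Hypothetical usage:
    # gcs_mv('/my-bucket/old-name.txt', '/my-bucket/new-name.txt')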

Useful links I have found so far:

  • Boto-based approach: https://developers.google.com/storage/docs/gspythonlibrary
  • GCS Client Lib: https://developers.google.com/appengine/docs/python/googlecloudstorageclient/
  • GCS Lib: https://code.google.com/p/appengine-gcs-client
  • Raw JSON API: https://developers.google.com/storage/docs/json_api

Solution

Use the JSON API; there is a copy method (https://developers.google.com/storage/docs/json_api/v1/objects/copy). Here is the official example for Python, using the Python Google API Client lib:

    # The destination object resource is entirely optional. If empty, we use
    # the source object's metadata.
    if reuse_metadata:
        destination_object_resource = {}
    else:
        destination_object_resource = {
                'contentLanguage': 'en',
                'metadata': {'my-key': 'my-value'},
        }
    req = client.objects().copy(
            sourceBucket=bucket_name,
            sourceObject=old_object,
            destinationBucket=bucket_name,
            destinationObject=new_object,
            body=destination_object_resource)
    resp = req.execute()
    print json.dumps(resp, indent=2)
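
The official example assumes a ready-made client object. Below is a hedged sketch of how that client might be constructed on App Engine with the Google API Python client, and how deleting the source object after the copy turns the operation into a full "mv". The bucket/object names and the service-account scope are assumptions, not part of the original answer:

    import httplib2
    from apiclient.discovery import build
    from oauth2client.appengine import AppAssertionCredentials

    bucket_name = 'my-bucket'       # hypothetical bucket
    old_object = 'path/old-name'    # hypothetical source object
    new_object = 'path/new-name'    # hypothetical destination object

    # Authenticate as the App Engine service account with read/write access.
    credentials = AppAssertionCredentials(
        scope='https://www.googleapis.com/auth/devstorage.read_write')
    client = build('storage', 'v1', http=credentials.authorize(httplib2.Http()))

    # Server-side copy: no object data passes through the application.
    client.objects().copy(
        sourceBucket=bucket_name,
        sourceObject=old_object,
        destinationBucket=bucket_name,
        destinationObject=new_object,
        body={}).execute()

    # Removing the source object completes the "mv".
    client.objects().delete(bucket=bucket_name, object=old_object).execute()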
    
