How to upload a folder to Google Cloud Storage using the Python API
Problem description
I have successfully uploaded a single text file to Google Cloud Storage. But when I try to upload a whole folder, it gives a permission denied error.
filename = "d:/foldername"  # here "foldername" is the folder.
Error:

Traceback (most recent call last):
  File "test1.py", line 142, in <module>
    upload()
  File "test1.py", line 106, in upload
    media = MediaFileUpload(filename, chunksize=CHUNKSIZE, resumable=True)
  File "D:\jatin\Project\GAE_django\GCS_test\oauth2client\util.py", line 132, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "D:\jatin\Project\GAE_django\GCS_test\apiclient\http.py", line 422, in __init__
    fd = open(self._filename, 'rb')
IOError: [Errno 13] Permission denied: 'd:/foldername'
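The traceback shows the root cause: MediaFileUpload simply calls open(filename, 'rb'), and a directory cannot be opened as a file. A minimal sketch of the failure and the guard that avoids it (the temporary directory here is illustrative, not from the question):

```python
import os
import tempfile

# MediaFileUpload does open(filename, 'rb'); on a directory this raises
# OSError (Errno 13 on Windows, IsADirectoryError on POSIX).
folder = tempfile.mkdtemp()

failed = False
try:
    open(folder, 'rb')
except OSError:
    failed = True

# Check before handing the path to an upload class:
print(os.path.isdir(folder), failed)
```

So a folder has to be walked and its files uploaded one by one, as in the answer below.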
Recommended answer
This works for me. It copies all content from a local directory to a specific bucket-name/full-path (recursively) in Google Cloud Storage:
import glob
import os

from google.cloud import storage


def upload_local_directory_to_gcs(local_path, bucket, gcs_path):
    """Recursively upload the contents of local_path to bucket under gcs_path."""
    assert os.path.isdir(local_path)
    for local_file in glob.glob(local_path + '/**'):
        if not os.path.isfile(local_file):
            # Recurse into sub-directories, extending the remote prefix.
            upload_local_directory_to_gcs(
                local_file, bucket, gcs_path + "/" + os.path.basename(local_file))
        else:
            remote_path = os.path.join(gcs_path, local_file[1 + len(local_path):])
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)


# bucket is a google.cloud.storage Bucket; BUCKET_FOLDER_DIR is the target prefix.
upload_local_directory_to_gcs(local_path, bucket, BUCKET_FOLDER_DIR)
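To see which object names this traversal produces without touching GCS, the same recursion can be run in a dry-run form. The collect_upload_paths helper below is illustrative (not part of the original answer); it records the (local file, remote blob name) pairs instead of uploading:

```python
import glob
import os


def collect_upload_paths(local_path, gcs_path):
    # Mirrors the traversal above, but only collects
    # (local file, remote blob name) pairs.
    pairs = []
    for local_file in glob.glob(local_path + '/**'):
        if not os.path.isfile(local_file):
            pairs.extend(collect_upload_paths(
                local_file, gcs_path + "/" + os.path.basename(local_file)))
        else:
            remote_path = os.path.join(gcs_path, local_file[1 + len(local_path):])
            pairs.append((local_file, remote_path))
    return pairs
```

Running it over a directory containing a.txt and sub/b.txt with gcs_path "backup" yields blob names like backup/a.txt and backup/sub/b.txt, i.e. the local layout is mirrored under the bucket prefix.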