Can you pass a keyfile.json to gsutil?

Question

I have a (maybe unique?) use case in some Python scripts that I am running. Namely, I want the parallel awesomeness of gsutil, so rather than doing from google.cloud import storage I use subprocess calls such as:

subprocess.Popen(["gsutil", "-q", "-m", "-o", "GSUtil:parallel_process_count=8,GSUtil:parallel_thread_count=8", "cp", files, destination])

in order to upload and download files from buckets.

In an instance group template I can pass in the service account via -scopes, but I'd like authentication to be handled at the application level. I tried setting environment variables and passing them to the subprocess:

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "keyfile.json"
tmp_env = os.environ.copy()
subprocess.Popen(['gsutil', ...], env=tmp_env)

to no avail. Running:

gcloud auth activate-service-account --key-file /path/to/keyfile.json --project my-project -q

seems to be the best way to authenticate with a JSON keyfile without requiring the Python API. But it doesn't work if I throw it in at the end of my Dockerfile, and while I could of course throw it in at the end of a startup.sh script executed by the bootstrap.sh script embedded in the instance group template, neither really accomplishes what I'd like. Namely, both move away from my original goal of having "gsutil authentication" at the application level.

tl;dr Is there a way to pass keyfile.json credentials to gsutil? Is this a feature the gsutil team has ever discussed? My apologies if I just haven't been hunting the Cloud Platform and gsutil docs well enough.

Answer

You can provide a pointer to a JSON key file for gsutil in your .boto configuration file like so:

[Credentials]
gs_service_key_file = /path/to/your/keyfile.json

This is equivalent to running gsutil config -e for a standalone (non-gcloud) install.
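
If you would rather not edit the global ~/.boto file, one option is to write a small Boto config of your own at runtime and point gsutil at it through the BOTO_CONFIG environment variable. The sketch below illustrates that approach; the key file path and bucket name are placeholders, not values from the original question.

import os
import subprocess
import tempfile

# Write a throwaway Boto config that names the JSON key file.
# /path/to/your/keyfile.json is a placeholder; substitute your real path.
cfg = tempfile.NamedTemporaryFile(mode="w", suffix=".boto", delete=False)
cfg.write("[Credentials]\ngs_service_key_file = /path/to/your/keyfile.json\n")
cfg.close()

# gsutil reads the config file named by BOTO_CONFIG instead of ~/.boto.
env = os.environ.copy()
env["BOTO_CONFIG"] = cfg.name
subprocess.run(["gsutil", "ls", "gs://your-bucket"], env=env, check=True)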

If you want to provide this on the command line as opposed to in your .boto configuration file, you can use the -o parameter similar to how you configured the process and thread counts in your command line. To wit:

subprocess.Popen(["gsutil", "-q", "-m", "-o", "Credentials:gs_service_key_file=/path/to/your/keyfile.json",
                 "-o", "GSUtil:parallel_process_count=8", "-o", GSUtil:parallel_thread_count=8", "cp", files, destination])

Note that you need to make sure the key file path is accessible from within your container.
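
A quick existence check before launching gsutil makes a missing or un-mounted key file fail loudly instead of surfacing as an opaque authentication error; this guard is only an illustration, with a placeholder path.

import os

keyfile = "/path/to/your/keyfile.json"  # placeholder path
if not os.path.isfile(keyfile):
    raise FileNotFoundError(
        "Service account key not found at %s; was it copied into the image "
        "or mounted into the container?" % keyfile)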
