Uploading a file to Google Bucket from local system using Airflow - Python
Question
So I am running this operator on Airflow:
from airflow.contrib.operators.file_to_gcs import FileToGoogleCloudStorageOperator

gcp_operator = \
    FileToGoogleCloudStorageOperator(
        task_id='gcp_task',
        src='/Users/john/Documents/tmp',
        dst='gs://constantine-bucket',
        bucket='constantine-bucket',
        google_cloud_storage_conn_id='DataScience',
        mime_type='Folder',
        dag=dag
    )
When I run it, I get this error:

"error": "invalid_scope",
"error_description": "\"https://www.googleapis.com/auth/devstorage.read_write\" is not a valid audience string"
Does anyone have an idea about how to run this operator on Airflow?

Answer
Sounds like a missing dependency. You should be able to install the GCP hooks and operators using:
pip install apache-airflow[gcp_api]
For more info, refer to this page: https://airflow.apache.org/installation.html
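Once the extra is installed, a minimal sketch of a working task might look like the following. It reuses the bucket name, file path, and connection id from the question, but assumes that dst is the object name inside the bucket (not a full gs:// URI) and that mime_type is a standard MIME string; the DAG id and start date are placeholders.

# Minimal sketch, assuming apache-airflow[gcp_api] is installed and an
# Airflow connection named "DataScience" points at a valid GCP service account.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.file_to_gcs import FileToGoogleCloudStorageOperator

dag = DAG(
    dag_id='upload_file_to_gcs',        # placeholder DAG id
    start_date=datetime(2018, 1, 1),
    schedule_interval=None,
)

gcp_task = FileToGoogleCloudStorageOperator(
    task_id='gcp_task',
    src='/Users/john/Documents/tmp',            # local file to upload
    dst='tmp',                                  # object name inside the bucket
    bucket='constantine-bucket',                # bucket name, no gs:// prefix
    google_cloud_storage_conn_id='DataScience',
    mime_type='application/octet-stream',       # a generic MIME type
    dag=dag,
)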