Airflow Remote Logging Connections - Airflow 1.7.1.3


Problem description


I've seen the post below, but I am trying to set this up for Airflow 1.7.1.3:

Airflow Remote logging not working

Does anyone have a specific example of the format required in Connections to specify the key file for a service account, so that it can access the relevant storage bucket of a project? This is what I've tried: {"project":"","key_path":""}

Solution

It seems like you're having trouble with Google credentials. There are a lot of ways to solve this, and I'll just explain my way of doing it. You must first create a connection ID in Airflow. You can do this either programmatically or through the Airflow web UI.
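If you go the programmatic route, a minimal sketch along these lines may work. Note that the connection ID, the conn_type and the extras keys below are my assumptions rather than something from the original answer; depending on the Airflow version the extras may also need a prefixed form (e.g. extra__google_cloud_platform__key_path), so the web UI form is the safer reference for the exact field names.

    # Hedged sketch: register a GCS logging connection in the Airflow metadata DB.
    # conn_id, conn_type and the extras keys are placeholders/assumptions.
    import json

    from airflow import settings
    from airflow.models import Connection

    conn = Connection(
        conn_id='gcs_logging',              # must match remote_log_conn_id in airflow.cfg
        conn_type='google_cloud_platform',  # assumed; older versions may use a GCS-specific type
        extra=json.dumps({
            'project': 'my-gcp-project',           # hypothetical project id
            'key_path': '/path/to/keyfile.json',   # the downloaded service account key
            'scope': 'https://www.googleapis.com/auth/devstorage.read_write',
        }))

    session = settings.Session()
    session.add(conn)
    session.commit()
    session.close()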

Here are the instructions for doing this via the UI:

Airflow web UI -> Admin -> Connections -> Create -> insert credential information -> Save


To use GCS, you must indicate it in the "Scopes" field of the connection. For the Keyfile Path, you must point to a JSON key file from your project. To get this you must go through:

Google Console -> API & Services -> Credentials (the key-shaped icon) -> API Credentials -> Create credentials -> Service account key -> choose your account key -> Create

Then your JSON key file will be downloaded.
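If you prefer the command line, roughly the same key file can be produced with the gcloud CLI (assuming the CLI is installed and the service account already exists; the account e-mail below is a placeholder):

    gcloud iam service-accounts keys create keyfile.json \
        --iam-account=my-service-account@my-gcp-project.iam.gserviceaccount.com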

The "Conn Id" you enter is the one that goes into "remote_log_conn_id" in your airflow.cfg file.

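For example, in a 1.7/1.8-era airflow.cfg the remote-logging settings sit in the [core] section; a minimal sketch, assuming a hypothetical log bucket and the connection ID from the sketch above:

    [core]
    # Bucket/path and connection ID are placeholders -- replace with your own values.
    remote_base_log_folder = gs://my-log-bucket/logs
    remote_log_conn_id = gcs_logging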
