Airflow Remote logging not working


Problem description

I have an up-and-running Apache Airflow 1.8.1 instance.

I have a working connection (and its ID) for writing to Google Cloud Storage, and my Airflow user has permission to write to the bucket.

I tried to use the remote log storage functionality by adding

remote_base_log_folder = 'gs://my-bucket/log'

remote_log_conn_id = 'my_working_conn_id'

And that's all (I didn't touch any configuration other than that).

I restarted all the services, but the logs aren't being uploaded to GCS (my bucket is still empty) and my filesystem space keeps shrinking.

Have you successfully enabled remote logging with GCS? If so, what did you change or do?

Recommended answer

I managed to get the remote logs into GCS. First, you need to give the service account permission to write to the GCS bucket.
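
As a rough sketch, the grant can be done with gsutil; the service account email and project ID below are illustrative assumptions (the bucket matches the log folder configured later in this answer):

# Grant the Airflow service account write access to the log bucket
# (service account email and project ID are assumptions for illustration)
gsutil iam ch serviceAccount:airflow@my-project.iam.gserviceaccount.com:objectAdmin gs://my-backup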

This is my GCP connection setup:
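
In the Airflow UI this is a connection of type "Google Cloud Platform" (Admin > Connections); a sketch of the fields, where the project ID, keyfile path, and scope are illustrative assumptions rather than values from the original answer:

Conn Id: my_gcp_conn
Conn Type: Google Cloud Platform
Project Id: my-project
Keyfile Path: /path/to/service-account-keyfile.json
Scopes: https://www.googleapis.com/auth/devstorage.read_write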

Then, edit the airflow.cfg file:

remote_base_log_folder = gs://my-backup/airflow_logs
remote_log_conn_id = my_gcp_conn
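
For context, in Airflow 1.8 these two settings live in the [core] section of airflow.cfg; a minimal sketch of the surrounding block (the local base_log_folder path is an assumption):

[core]
# Task logs are written locally first and uploaded to GCS when the task finishes
# (the local path below is an assumption, not from the original answer)
base_log_folder = /home/airflow/airflow/logs
remote_base_log_folder = gs://my-backup/airflow_logs
remote_log_conn_id = my_gcp_conn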

After editing the config file, you need to re-initialize it:

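# re-initialize the Airflow metadata database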
airflow initdb

# start the web server, default port is 8080
airflow webserver -p 8080

Test by turning on the "tutorial" DAG; you should be able to see the logs both locally and remotely in GCS.
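
As a quick command-line check, the uploaded logs can be listed with gsutil; the <dag_id>/<task_id>/<execution_date> layout follows Airflow 1.8's log naming convention:

# List the uploaded task logs for the "tutorial" DAG
gsutil ls gs://my-backup/airflow_logs/tutorial/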
