How to re-direct logs from Azure Databricks to another destination?

Problem description

We could use some help on how to send Spark driver and worker logs to a destination outside Azure Databricks, such as Azure Blob Storage or Elasticsearch (e.g. via Elastic Beats).

When configuring a new cluster, the only option offered for the log delivery destination is DBFS; see
https://docs.azuredatabricks.net/user-guide/clusters/log-delivery.html

Any input much appreciated, thanks!

Recommended answer

Maybe the following could be helpful:

First, you specify a DBFS location for your Spark driver and worker logs:
https://docs.databricks.com/user-guide/clusters/log-delivery.html
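
If you prefer to set this up programmatically rather than through the cluster UI, a minimal sketch against the Clusters REST API might look like the following. The workspace URL, token, runtime version and node type are placeholder assumptions, not values from the question; the relevant part is `cluster_log_conf`.

```python
# Sketch: create a cluster whose driver/worker logs are delivered to DBFS,
# using the Clusters API (POST /api/2.0/clusters/create).
# HOST, TOKEN, spark_version and node_type_id are placeholders.
import requests

HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder; keep real tokens in a secret store

payload = {
    "cluster_name": "log-delivery-demo",
    "spark_version": "5.5.x-scala2.11",  # any runtime available in your workspace
    "node_type_id": "Standard_DS3_v2",   # any Azure node type
    "num_workers": 1,
    # Databricks periodically delivers driver and worker logs to this path.
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```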

Then, you create a mount point that links your DBFS folder to a Blob Storage container:
https://docs.databricks.com/spark/latest/data-sources/azure/azure-storage.html#mount-azure-blob-storage-containers-with-dbfs
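
A minimal sketch of that mount, run from a notebook (where `dbutils` is available); the container, storage account, and secret scope/key names are placeholders:

```python
# Sketch: mount a Blob Storage container at /mnt/cluster-logs so that
# anything written under that DBFS path lands in the container.
# <container>, <storage-account>, <scope> and <storage-key> are placeholders.
dbutils.fs.mount(
    source="wasbs://<container>@<storage-account>.blob.core.windows.net",
    mount_point="/mnt/cluster-logs",
    extra_configs={
        "fs.azure.account.key.<storage-account>.blob.core.windows.net":
            dbutils.secrets.get(scope="<scope>", key="<storage-key>")
    },
)
```

With the mount in place, point the cluster's log delivery at dbfs:/mnt/cluster-logs; the logs then end up in the Blob container, where a shipper such as Filebeat could pick them up and forward them to Elasticsearch.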

Hope this helps!
