Throw Spark job error to ADF


Question

I am using Azure Data Factory to run a Spark script. If the Spark processing fails partway through, is there any way to surface the actual error to ADF and display it in the ADF blade? Currently it shows only a generic message; to see the actual error, we have to go to the YARN UI or check the logs in the storage account.

Does anyone know how to achieve this?

Error in Activity: Spark job failed. BatchId=<someid.> Please find the log in the storage if GetDebugInfo is set to 'Always' or 'Failure'.

Answer

Hi,

I don't think the error shown in the UI can be changed. You might try having a Log Analytics alert send you an email notification.
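As a workaround for the generic message, one common pattern is to have the Spark script itself catch the failure, write the real traceback to a well-known file in the linked storage, and then exit non-zero so the ADF activity still fails. A minimal stdlib-only sketch (the file name and the simulated failure are hypothetical; in a real job the error file would live under the storage account the cluster can write to):

```python
import sys
import traceback

# Hypothetical name: in a real job, point this at a path on the
# storage account that the cluster (and a follow-up ADF activity) can read.
ERROR_FILE = "spark_job_error.txt"

def run_job():
    """Placeholder for the actual Spark processing logic."""
    raise ValueError("schema mismatch in input table")  # simulated failure

def main():
    try:
        run_job()
        return 0
    except Exception:
        message = traceback.format_exc()
        # Persist the real error to a well-known file so a follow-up
        # ADF activity (or a person) can read it directly instead of
        # digging through YARN container logs.
        with open(ERROR_FILE, "w") as f:
            f.write(message)
        # Also echo to stderr so it lands in the driver log that
        # GetDebugInfo copies into the storage account.
        print(message, file=sys.stderr)
        return 1  # a non-zero exit code still marks the ADF activity failed

exit_code = main()  # in the real script: sys.exit(main())
```

This does not change what the ADF blade displays, but it makes the actual error retrievable from one known location rather than from scattered YARN logs.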

