Spark 1.5.0 spark.app.id warning


Question

I have updated my CDH cluster to use Spark 1.5.0. When I submit a Spark application, the system shows a warning about spark.app.id:

Using default name DAGScheduler for source because spark.app.id is not set.

I have searched for spark.app.id but found no documentation about it. I read this link, and I think it is used for REST API calls.

I didn't see this warning in Spark 1.4. Could someone explain it to me and show me how to set it?

Answer

It's not necessarily used for the REST API, but rather for monitoring purposes, e.g. when you want to check the YARN logs for an application:

yarn logs -applicationId <spark.app.id>

It's true that this specific setting is still not documented. I think it was added to standardize application deployment within the Hadoop ecosystem.

I suggest that you set spark.app.id in your app:

conf.set("spark.app.id", "<app-id>") // assuming you already have a SparkConf defined
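As a sketch, the same property can also be passed at submit time without touching the code, since spark-submit accepts arbitrary Spark properties via --conf; the app id value, class name, and jar name below are hypothetical placeholders:

```shell
# Set spark.app.id when submitting, instead of in the SparkConf
# (my-app-id, com.example.MyApp, and my-app.jar are placeholders)
spark-submit \
  --conf spark.app.id=my-app-id \
  --class com.example.MyApp \
  my-app.jar
```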

Nevertheless, this remains a warning and won't affect the application itself.
