Why does Spark exit with exitCode: 16?


Question

I am using Spark 2.0.0 with Hadoop 2.7 in yarn-cluster mode. Every time, I get the following error:

17/01/04 11:18:04 INFO spark.SparkContext: Successfully stopped SparkContext
17/01/04 11:18:04 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)
17/01/04 11:18:04 INFO util.ShutdownHookManager: Shutdown hook called
17/01/04 11:18:04 INFO util.ShutdownHookManager: Deleting directory /tmp/hadoop-hduser/nm-local-dir/usercache/harry/appcache/application_1475261544699_0833/spark-42e40ac3-279f-4c3f-ab27-9999d20069b8
17/01/04 11:18:04 INFO spark.SparkContext: SparkContext already stopped.

However, I do get the correct printed output. The same code works fine with Spark 1.4.0 and Hadoop 2.4.0, where I do not see any exit codes.

Answer

This issue, ".sparkStaging not cleaned if application exited incorrectly" (https://issues.apache.org/jira/browse/SPARK-17340), started after Spark 1.4 (affected versions: 1.5.2, 1.6.1, 2.0.0).

The issue is: when running Spark in yarn-cluster mode and killing the application, .sparkStaging is not cleaned up.

When this issue occurs, exitCode 16 is raised in Spark 2.0.x:

ERROR ApplicationMaster: RECEIVED SIGNAL TERM
INFO ApplicationMaster: Final app status: FAILED, exitCode: 16, (reason: Shutdown hook called before final status was reported.)

Is it possible that something in your code is killing the application? If so, it wouldn't be reported in Spark 1.4, but it would be in Spark 2.0.0.

Please search your code for "exit": if you have such a call in your code, the error won't be shown in Spark 1.4, but it will in Spark 2.0.0.
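To see why an explicit exit call matters here: the error message says the shutdown hook was called before the final status was reported, and an abrupt process exit races with (or skips) registered cleanup hooks. As a rough analogy in plain Python (not Spark itself; `atexit` stands in for the JVM shutdown hook the ApplicationMaster relies on), a normal return lets the hook run, while an abrupt exit skips it entirely:

```python
import subprocess
import sys
import textwrap

# Child process that finishes normally: the atexit hook runs on exit.
clean = textwrap.dedent("""
    import atexit
    atexit.register(lambda: print("hook: final status reported"))
    print("work done", flush=True)
""")

# Child process that exits abruptly via os._exit(): registered hooks never run.
abrupt = textwrap.dedent("""
    import atexit, os
    atexit.register(lambda: print("hook: final status reported"))
    print("work done", flush=True)
    os._exit(0)
""")

out_clean = subprocess.run([sys.executable, "-c", clean],
                           capture_output=True, text=True).stdout
out_abrupt = subprocess.run([sys.executable, "-c", abrupt],
                            capture_output=True, text=True).stdout

print("clean exit output: ", out_clean.strip().splitlines())
print("abrupt exit output:", out_abrupt.strip().splitlines())
```

The safer pattern in driver code is to call `spark.stop()` and let the main method return normally, rather than calling `System.exit` / `sys.exit` yourself.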
