How to specify which Java version to use in the spark-submit command?
Problem description
I want to run a Spark Streaming application on a YARN cluster on a remote server. The default Java version is 1.7, but I want to use 1.8 for my application, which is also installed on the server but is not the default. Is there a way to specify through spark-submit the location of Java 1.8 so that I do not get a major.minor version error?
Recommended answer
JAVA_HOME was not enough in our case: the driver was running on Java 8, but I later discovered that the Spark workers in YARN were launched with Java 7 (the Hadoop nodes have both Java versions installed).
I had to add spark.executorEnv.JAVA_HOME=/usr/java/<version available in workers> in spark-defaults.conf. Note that you can also provide it on the command line with --conf.
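As a sketch of both forms (assuming Java 8 lives at /usr/lib/jvm/java-1.8.0 on the worker nodes, and using a hypothetical application class and jar name — adjust all three to your cluster), the setting can go in spark-defaults.conf or be passed at submit time:

```shell
# spark-defaults.conf — point YARN executors (and, in yarn-cluster mode,
# the application master) at the Java 8 installation on the nodes:
#   spark.executorEnv.JAVA_HOME        /usr/lib/jvm/java-1.8.0
#   spark.yarn.appMasterEnv.JAVA_HOME  /usr/lib/jvm/java-1.8.0

# Equivalent one-off form on the command line
# (com.example.StreamingApp and my-streaming-app.jar are placeholders):
spark-submit \
  --master yarn \
  --conf "spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0" \
  --conf "spark.yarn.appMasterEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0" \
  --class com.example.StreamingApp \
  my-streaming-app.jar
```

The spark.executorEnv.* setting covers the executors; in yarn-cluster mode the driver runs inside the application master, so spark.yarn.appMasterEnv.JAVA_HOME is needed there as well.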
See http://spark.apache.org/docs/latest/configuration.html#runtime-environment