How to use spark-submit's --properties-file option to launch Spark application in IntelliJ IDEA?


Problem description

I'm starting a project using Spark, developed with Scala in the IntelliJ IDE.

I was wondering how to set --properties-file with a specific Spark configuration in the IntelliJ run configuration.

I'm reading configuration like this: "param1" -> sc.getConf.get("param1")
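For context, a minimal sketch of the read side in Scala. The property name spark.param1 and its value are illustrative; the SparkConf is populated directly here, whereas spark-submit would merge in the keys from the properties file (note that Spark only propagates keys prefixed with spark.):

```scala
import org.apache.spark.SparkConf

object ConfExample {
  def main(args: Array[String]): Unit = {
    // Stand-in for what --properties-file would provide at submit time
    val conf = new SparkConf().set("spark.param1", "value1")

    // sc.getConf.get(...) on a SparkContext reads from this same SparkConf
    val param1 = conf.get("spark.param1")
    println(param1)

    // getOption avoids a NoSuchElementException when the key is absent
    println(conf.getOption("spark.missing"))
  }
}
```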

When I execute the Spark job from the command line it works like a charm:

/opt/spark/bin/spark-submit --class "com.class.main" --master local --properties-file properties.conf ./target/scala-2.11/main.jar arg1 arg2 arg3 arg4

The problem is when I execute the job using an IntelliJ Run Configuration with VM Options:

  1. I succeed with the --master param as -Dspark.master=local
  2. I succeed with --conf params as -Dspark.param1=value1
  3. I fail with --properties-file
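For reference, the VM Options line this refers to would look like the following (key names and values are illustrative). Each -Dspark.* system property is picked up by SparkConf individually, which is why items 1 and 2 work; there is no analogous single option that loads a whole properties file:

```
-Dspark.master=local -Dspark.param1=value1
```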

Can anyone point me to the right way to set this up?

Recommended answer

I don't think it's possible to use --properties-file to launch a Spark application from within IntelliJ IDEA.

spark-submit is the shell script that submits a Spark application for execution; it does a few extra things to create a proper submission environment for the Spark application before launching it.

You can, however, mimic the behaviour of --properties-file by leveraging conf/spark-defaults.conf, which a Spark application loads by default.

You could create a conf/spark-defaults.conf under src/test/resources (or src/main/resources) with the content of properties.conf. That should work.
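As a sketch, such a file could look like this (the key names mirror the command line above; values are illustrative). spark-defaults.conf uses whitespace-separated key/value pairs, and only keys prefixed with spark. are honoured:

```
# src/test/resources/conf/spark-defaults.conf
spark.master    local
spark.param1    value1
```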
