How do I pass program arguments to the main function when running spark-submit with a JAR?


Problem description

I know this is a trivial question, but I could not find the answer on the internet.

I am trying to run a Java class whose main function takes program arguments (String[] args).

However, when I submit the job using spark-submit and pass program arguments as I would with

java -cp <some jar>.jar <Some class name> <arg1> <arg2>

it does not read the args.

The command I tried is

bin/spark-submit analytics-package.jar --class full.package.name.ClassName 1234 someargument someArgument

This gives

Error: No main class set in JAR; please specify one with --class

When I try:

bin/spark-submit --class full.package.name.ClassName 1234 someargument someArgument analytics-package.jar 

I get

Warning: Local jar /mnt/disk1/spark/1 does not exist, skipping.
java.lang.ClassNotFoundException: com.relcy.analytics.query.QueryAnalytics
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:176)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:693)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

How can I pass these arguments? They change frequently on each run of the job, and they need to be passed as arguments.

Recommended answer

Arguments passed before the .jar file will be arguments to the JVM, whereas arguments after the jar file will be passed on to the user's program.

bin/spark-submit --class classname -Xms256m -Xmx1g something.jar someargument

Here, s will equal someargument, while -Xms and -Xmx are passed to the JVM.

public static void main(String[] args) {

    String s = args[0];
}
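Applied to the command from the question, the fix is simply ordering: spark-submit options (like --class) go before the jar, and program arguments go after it. A minimal sketch, assuming the jar and class names from the question; the parseId helper is hypothetical, added only to illustrate consuming the first argument:

```java
// Corrected invocation (options before the jar, program arguments after it):
//
//   bin/spark-submit --class full.package.name.ClassName analytics-package.jar 1234 someargument someArgument
//
public class ClassName {
    // Hypothetical helper: parses the first program argument as a numeric id.
    static int parseId(String[] args) {
        return Integer.parseInt(args[0]);
    }

    public static void main(String[] args) {
        // With the invocation above, args == {"1234", "someargument", "someArgument"}
        System.out.println("id = " + parseId(args));
        for (int i = 1; i < args.length; i++) {
            System.out.println("arg " + i + " = " + args[i]);
        }
    }
}
```

With this ordering, spark-submit reads everything after analytics-package.jar verbatim and hands it to main as args, which resolves both the "No main class set in JAR" error and the ClassNotFoundException from the question.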

