spark-submit through java code
I am trying to do spark-submit through Java code. I am referring to the following example.
But I am getting
The constructor ClientArguments(String[], SparkConf) is undefined
This is my code.
import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
public class SparkSubmitJava {
    public static void main(String[] arguments) throws Exception {
        String[] args = new String[] {"--name", "myname", "--jar", "/home/cloudera/Desktop/ScalaTest.jar", "--class", "ScalaTest.ScalaTest.ScalaTest", "--arg", "3", "--arg", "yarn-cluster"};
        Configuration config = new Configuration();
        System.setProperty("SPARK_YARN_MODE", "true");
        SparkConf sparkConf = new SparkConf();
        ClientArguments cArgs = new ClientArguments(args, sparkConf); // getting constructor error
        Client client = new Client(cArgs, config, sparkConf); // getting constructor error
        client.run();
    }
}
My pom.xml dependency section:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-yarn_2.9.3</artifactId>
    <version>0.8.1-incubating</version>
</dependency>
Any help will be appreciated.
Considering what you have shared from your pom.xml, here is your problem: you are using a very old version of the spark-yarn library (0.8.1-incubating), which you need to replace with the version matching your spark-core. Since you are using Spark 1.3, this is the dependency you need instead of the one you are using:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-yarn_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
Secondly, you are using incompatible versions of the libraries Scala-wise. Note that the _2.10 and _2.9.3 suffixes are very important: each one selects the Scala version a dependency was compiled against, so every Spark artifact on your classpath must share the same suffix, and you should be careful about that.
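One common way to keep the Scala suffix and Spark version consistent across all Spark artifacts is to factor them into Maven properties, so a mismatch like the one above cannot creep in. This is a sketch using standard Maven property interpolation, not something from the original post:

```xml
<properties>
    <!-- Single source of truth for the Scala binary version and the Spark release -->
    <scala.binary.version>2.10</scala.binary.version>
    <spark.version>1.3.0</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-yarn_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

After changing the pom, running mvn dependency:tree is a quick way to confirm that no artifact with a different _2.x suffix is still being pulled in transitively.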