How to submit multiple jars to workers through sparkSession?
Question
I am using Spark 2.2.0. Below is the Java code snippet that I am using as a job on Spark:
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Build a session and ship the MySQL JDBC driver jar to the workers.
SparkSession spark = SparkSession.builder()
        .appName("MySQL Connection")
        .master("spark://ip:7077")
        .config("spark.jars", "/path/mysql.jar")
        .getOrCreate();

// Read the `account` table over JDBC.
Dataset<Row> dataset = spark.read().format("jdbc")
        .option("url", "jdbc:mysql://ip:3306/mysql")
        .option("user", "superadmin")
        .option("password", "****")
        .option("dbtable", "account")
        .load();
The above code works perfectly, but the problem is: if I need to submit two jars, how do I do it? The config() method accepts only a single key ('spark.jars') and a single value (the path to the jar). I know how to send multiple jars with SparkConf().setJars(), but that is of no use to me since I need to use SparkSession.
Can anyone help?
Answer
As explained in "spark submit add multiple jars in classpath" and "Passing additional jars to Spark via spark-submit", you should use a comma-separated list:
SparkSession spark = SparkSession.builder()
        .appName("MySQL Connection")
        .master("spark://ip:7077")
        .config("spark.jars", "/path/mysql.jar,/path/to/another.jar")
        .getOrCreate();
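If the jar paths come from a collection at runtime, the comma-separated string can be assembled with String.join. A minimal sketch, assuming the two placeholder paths from above:

import java.util.Arrays;
import java.util.List;

// Placeholder jar paths; join them into the comma-separated form
// that the spark.jars property expects.
List<String> jars = Arrays.asList("/path/mysql.jar", "/path/to/another.jar");
String sparkJars = String.join(",", jars);

SparkSession spark = SparkSession.builder()
        .appName("MySQL Connection")
        .master("spark://ip:7077")
        .config("spark.jars", sparkJars)
        .getOrCreate();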
Regarding "I know how to send multiple jars if used SparkConf().setJars() but not of my use since I need to use SparkSession": SparkConf is not ruled out here. A fully configured SparkConf can be passed straight to the session builder via SparkSession.builder().config(conf), so setJars() remains an option.
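A minimal sketch of that approach, reusing the app name, master URL, and placeholder jar paths from the question:

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

// Configure everything, including multiple jars, on a SparkConf...
SparkConf conf = new SparkConf()
        .setAppName("MySQL Connection")
        .setMaster("spark://ip:7077")
        .setJars(new String[] {"/path/mysql.jar", "/path/to/another.jar"});

// ...and hand the conf to the SparkSession builder.
SparkSession spark = SparkSession.builder()
        .config(conf)
        .getOrCreate();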