How to set Spark configuration properties using Apache Livy?


Question

I don't know how to pass SparkSession parameters programmatically when submitting a Spark job to Apache Livy:

This is the Test Spark job:

import org.apache.livy.{Job, JobContext}

class Test extends Job[Int] {

  override def call(jc: JobContext): Int = {

    // Obtain the SparkSession that Livy manages for this job
    val spark = jc.sparkSession()

    // ...

  }
}

This is how the Spark job is submitted to Livy:

import java.io.File
import java.net.URI

import org.apache.livy.LivyClientBuilder

val client = new LivyClientBuilder()
  .setURI(new URI(livyUrl))
  .build()

try {
  // Upload the jar that contains the Test job class
  client.uploadJar(new File(testJarPath)).get()

  // Submit the job to the Livy session
  client.submit(new Test())

} finally {
  client.stop(true)
}
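As a side note not raised in the original question: submit returns a JobHandle, which extends java.util.concurrent.Future, so the caller can block until the job completes and read the Int it produces. A minimal sketch:

val handle = client.submit(new Test())
// Blocks until the job finishes on the cluster and returns its result
val result: Int = handle.get()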

How can I pass the following configuration parameters to the SparkSession?

  .config("es.nodes","1localhost")
  .config("es.port",9200)
  .config("es.nodes.wan.only","true")
  .config("es.index.auto.create","true")

Answer

You can do that easily through the LivyClientBuilder, like this:

val client = new LivyClientBuilder()
  .setURI(new URI(livyUrl))
  .setConf("es.nodes","1localhost")
  .setConf("key", "value")
  .build()
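Applied to the settings from the question, the same pattern would look roughly like this (a sketch; setConf takes string values, so numeric settings such as the port are passed as strings):

val client = new LivyClientBuilder()
  .setURI(new URI(livyUrl))
  // Pass the Elasticsearch connector settings from the question
  .setConf("es.nodes", "1localhost")
  .setConf("es.port", "9200")
  .setConf("es.nodes.wan.only", "true")
  .setConf("es.index.auto.create", "true")
  .build()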

