Why does Spark Cassandra Connector fail with NoHostAvailableException?


Question


I am having problems getting Spark Cassandra Connector working in Scala.

I am using these versions:

  • Scala 2.10.4
  • spark-core 1.0.2
  • cassandra-thrift 2.1.0 (my installed Cassandra is v2.1.0)
  • cassandra-clientutil 2.1.0
  • cassandra-driver-core 2.0.4 (recommended for the connector?)
  • spark-cassandra-connector 1.0.0


I can connect and talk to Cassandra (w/o spark) and I can talk to Spark (w/o Cassandra) but the connector gives me:


com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /10.0.0.194:9042 (com.datastax.driver.core.TransportException: [/10.0.0.194:9042] Cannot connect))


What am I missing? Cassandra is a default install (port 9042 for cql according to cassandra.yaml). I'm trying to connect locally ("local").

My code:

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._ // provides sc.cassandraTable

val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
val sc = new SparkContext("local", "test", conf)
val rdd = sc.cassandraTable("myks", "users")
val rr = rdd.first
println(s"Result: $rr")

Answer


Local in this context is specifying the Spark master (telling it to run in local mode) and not the Cassandra connection host.


To set the Cassandra connection host, you have to set a different property in the Spark config:

import org.apache.spark._

val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "IP Cassandra Is Listening On")
  .set("spark.cassandra.username", "cassandra") // Optional
  .set("spark.cassandra.password", "cassandra") // Optional

val sc = new SparkContext("spark://Spark Master IP:7077", "test", conf)

https://github.com/datastax/spark-cassandra-connector/blob/master/doc/1_connecting.md
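Putting the two points together for the local-mode setup in the question, a minimal sketch might look like the following. This assumes Cassandra is listening on 127.0.0.1 (the default for a local install) and reuses the keyspace/table names from the question; adjust the host to wherever your Cassandra node actually listens.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._ // provides sc.cassandraTable

// "local" stays as the Spark master; the connector gets its own host setting.
val conf = new SparkConf(true)
  .setAppName("Simple Application")
  .setMaster("local")
  .set("spark.cassandra.connection.host", "127.0.0.1") // assumption: local Cassandra

val sc = new SparkContext(conf)
val rdd = sc.cassandraTable("myks", "users")
println(s"Result: ${rdd.first}")
```

The key design point is that the connector never infers the Cassandra address from the Spark master URL; without `spark.cassandra.connection.host` it falls back to its own default, which is why the driver reports NoHostAvailableException even though Spark itself starts fine.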
