Why does Spark Cassandra Connector fail with NoHostAvailableException?


Problem Description


I am having problems getting Spark Cassandra Connector working in Scala.

I'm using these versions:

  • Scala 2.10.4
  • spark-core 1.0.2
  • cassandra-thrift 2.1.0 (my installed cassandra is v2.1.0)
  • cassandra-clientutil 2.1.0
  • cassandra-driver-core 2.0.4 (recommended for connector?)
  • spark-cassandra-connector 1.0.0

I can connect and talk to Cassandra (w/o spark) and I can talk to Spark (w/o Cassandra) but the connector gives me:

com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /10.0.0.194:9042 (com.datastax.driver.core.TransportException: [/10.0.0.194:9042] Cannot connect))

What am I missing? Cassandra is a default install (port 9042 for cql according to cassandra.yaml). I'm trying to connect locally ("local").

My code:

val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
val sc = new SparkContext("local","test",conf)
val rdd = sc.cassandraTable("myks","users")
val rr = rdd.first
println(s"Result: $rr")

Solution

The "local" in this code specifies the Spark master (telling Spark to run in local mode), not the Cassandra connection host.

To set the Cassandra connection host, you have to set a different property in the Spark config:

import org.apache.spark._

val conf = new SparkConf(true)
        .set("spark.cassandra.connection.host", "IP Cassandra Is Listening On")
        .set("spark.cassandra.username", "cassandra") //Optional            
        .set("spark.cassandra.password", "cassandra") //Optional

val sc = new SparkContext("spark://Spark Master IP:7077", "test", conf)

https://github.com/datastax/spark-cassandra-connector/blob/master/doc/1_connecting.md
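Applying this to the original local-mode code, a minimal sketch might look like the following. The keyspace and table names (`myks`, `users`) come from the question; `127.0.0.1` is an assumption for the host Cassandra is listening on, so substitute the `rpc_address` from your cassandra.yaml if it differs:

```scala
import org.apache.spark.{SparkConf, SparkContext}
// The connector's implicits add cassandraTable to SparkContext
import com.datastax.spark.connector._

// Run Spark in local mode and point the connector at the Cassandra node.
// 127.0.0.1 is an assumption -- use whatever address Cassandra is
// actually bound to (rpc_address in cassandra.yaml).
val conf = new SparkConf(true)
  .setAppName("Simple Application")
  .setMaster("local")
  .set("spark.cassandra.connection.host", "127.0.0.1")

val sc = new SparkContext(conf)
val rdd = sc.cassandraTable("myks", "users")
println(s"Result: ${rdd.first}")
```

If the driver still reports NoHostAvailableException, it is worth confirming that the host you pass in matches the address Cassandra is bound to and that the native transport port (9042 by default) is reachable.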
