Cannot connect to Cassandra from Spark (Contact points contain multiple data centers)
Problem description
I am trying to run my first Spark job (a Scala job that accesses Cassandra), and it fails with the following error:
java.io.IOException: Failed to open native connection to Cassandra at {<ip>}:9042
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:164)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
...
Caused by: java.lang.IllegalArgumentException: Contact points contain multiple data centers:
at com.datastax.spark.connector.cql.LocalNodeFirstLoadBalancingPolicy.init(LocalNodeFirstLoadBalancingPolicy.scala:47)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1099)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:271)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:157)
What are we doing wrong?
I am using:

- Spark 1.5.2
- Apache Cassandra 2.1.10
- spark-cassandra-connector 1.3.1 / 1.5.0-M2 (tried both connectors)
- Scala 2.10.4
Answer
--> According to the author, there is work in progress to fix this. See the comments below this answer.
I found this in the documentation; I hope it will help you:
override def init(cluster: Cluster, hosts: JCollection[Host]) {
  nodes = hosts.toSet
  // use explicitly set DC if available, otherwise see if all contact points have same DC
  // if so, use that DC; if not, throw an error
  dcToUse = localDC match {
    case Some(local) => local
    case None =>
      val dcList = dcs(nodesInTheSameDC(contactPoints, hosts.toSet))
      if (dcList.size == 1)
        dcList.head
      else
        throw new IllegalArgumentException(s"Contact points contain multiple data centers: ${dcList.mkString(", ")}")
  }
  clusterMetadata = cluster.getMetadata
}
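Stripped of the connector's internals, the check boils down to the following (a simplified sketch; `Host` and `resolveDataCenter` are illustrative names, not the connector's API):

```scala
// Simplified model of the connector's data-center check (names are illustrative)
case class Host(address: String, dataCenter: String)

def resolveDataCenter(contactPoints: Seq[Host], localDC: Option[String]): String =
  localDC.getOrElse {
    // no DC set explicitly: all contact points must agree on one
    val dcs = contactPoints.map(_.dataCenter).distinct
    if (dcs.size == 1) dcs.head
    else throw new IllegalArgumentException(
      s"Contact points contain multiple data centers: ${dcs.mkString(", ")}")
  }
```

So the error goes away if either every address listed in `spark.cassandra.connection.host` belongs to the same Cassandra data center, or the local data center is set explicitly (the connector's configuration reference documents a `spark.cassandra.connection.local_dc` property for this).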