How to load Spark Cassandra Connector in the shell?


Problem Description

I am trying to use the Spark Cassandra Connector in Spark 1.1.0.


I have successfully built the jar file from the master branch on GitHub and have gotten the included demos to work. However, when I try to load the jar file into the spark-shell, I can't import any of the classes from the com.datastax.spark.connector package.


I have tried using the --jars option on spark-shell and adding the directory containing the jar file to Java's CLASSPATH. Neither of these options works. In fact, when I use the --jars option, the logging output shows that the Datastax jar is getting loaded, but I still cannot import anything from com.datastax.


I have been able to load the Tuplejump Calliope Cassandra connector into the spark-shell using --jars, so I know that's working. It's just the Datastax connector which is failing for me.

Recommended Answer


I got it. Below is what I did:

$ git clone https://github.com/datastax/spark-cassandra-connector.git
$ cd spark-cassandra-connector
$ sbt/sbt assembly
$ $SPARK_HOME/bin/spark-shell --jars ~/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/connector-assembly-1.2.0-SNAPSHOT.jar 

At the Scala prompt:

scala> sc.stop
scala> import com.datastax.spark.connector._
scala> import org.apache.spark.SparkContext
scala> import org.apache.spark.SparkContext._
scala> import org.apache.spark.SparkConf
scala> val conf = new SparkConf(true).set("spark.cassandra.connection.host", "my cassandra host")
scala> val sc = new SparkContext("spark://spark host:7077", "test", conf)
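With the new context in place, a quick way to confirm the connector is actually usable is to read a table through it. This is a minimal sketch, assuming a keyspace named test with a table kv already exists in your Cassandra cluster (both names are placeholders; substitute your own):

scala> val rdd = sc.cassandraTable("test", "kv")
scala> rdd.count

The cassandraTable method becomes available on the SparkContext via the import com.datastax.spark.connector._ line above; if that call compiles and returns a count, the jar was loaded correctly.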

