Installing the Cassandra Spark Connector


Problem Description

As per

https://github.com/datastax/spark-cassandra-connector
http://spark-packages.org/package/datastax/spark-cassandra-connector

I ran the command, but at the end it looks like there are errors. Are these fatal, or do I need to resolve them?

[idf@node1 bin]$ spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.11
Ivy Default Cache set to: /home/idf/.ivy2/cache
The jars for the packages stored in: /home/idf/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found datastax#spark-cassandra-connector;1.6.0-M1-s_2.11 in spark-packages
        found org.apache.cassandra#cassandra-clientutil;3.0.2 in central
        found com.datastax.cassandra#cassandra-driver-core;3.0.0 in central
        found io.netty#netty-handler;4.0.33.Final in central
        found io.netty#netty-buffer;4.0.33.Final in central
        found io.netty#netty-common;4.0.33.Final in central
        found io.netty#netty-transport;4.0.33.Final in central
        found io.netty#netty-codec;4.0.33.Final in central
        found io.dropwizard.metrics#metrics-core;3.1.2 in list
        found org.slf4j#slf4j-api;1.7.7 in central
        found org.apache.commons#commons-lang3;3.3.2 in list
        found com.google.guava#guava;16.0.1 in central
        found org.joda#joda-convert;1.2 in central
        found joda-time#joda-time;2.3 in central
        found com.twitter#jsr166e;1.1.0 in central
        found org.scala-lang#scala-reflect;2.11.7 in list
        [2.11.7] org.scala-lang#scala-reflect;2.11.7
downloading http://dl.bintray.com/spark-packages/maven/datastax/spark-cassandra-connector/1.6.0-M1-s_2.11/spark-cassandra-connector-1.6.0-M1-s_2.11.jar ...
        [SUCCESSFUL ] datastax#spark-cassandra-connector;1.6.0-M1-s_2.11!spark-cassandra-connector.jar (2430ms)
downloading https://repo1.maven.org/maven2/org/apache/cassandra/cassandra-clientutil/3.0.2/cassandra-clientutil-3.0.2.jar ...
        [SUCCESSFUL ] org.apache.cassandra#cassandra-clientutil;3.0.2!cassandra-clientutil.jar (195ms)
downloading https://repo1.maven.org/maven2/com/datastax/cassandra/cassandra-driver-core/3.0.0/cassandra-driver-core-3.0.0.jar ...
        [SUCCESSFUL ] com.datastax.cassandra#cassandra-driver-core;3.0.0!cassandra-driver-core.jar(bundle) (874ms)
downloading https://repo1.maven.org/maven2/com/google/guava/guava/16.0.1/guava-16.0.1.jar ...
        [SUCCESSFUL ] com.google.guava#guava;16.0.1!guava.jar(bundle) (1930ms)
downloading https://repo1.maven.org/maven2/org/joda/joda-convert/1.2/joda-convert-1.2.jar ...
        [SUCCESSFUL ] org.joda#joda-convert;1.2!joda-convert.jar (68ms)
downloading https://repo1.maven.org/maven2/joda-time/joda-time/2.3/joda-time-2.3.jar ...
        [SUCCESSFUL ] joda-time#joda-time;2.3!joda-time.jar (524ms)
downloading https://repo1.maven.org/maven2/com/twitter/jsr166e/1.1.0/jsr166e-1.1.0.jar ...
        [SUCCESSFUL ] com.twitter#jsr166e;1.1.0!jsr166e.jar (138ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-handler/4.0.33.Final/netty-handler-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-handler;4.0.33.Final!netty-handler.jar (266ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-buffer/4.0.33.Final/netty-buffer-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-buffer;4.0.33.Final!netty-buffer.jar (202ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-transport/4.0.33.Final/netty-transport-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-transport;4.0.33.Final!netty-transport.jar (330ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-codec/4.0.33.Final/netty-codec-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-codec;4.0.33.Final!netty-codec.jar (157ms)
downloading https://repo1.maven.org/maven2/io/netty/netty-common/4.0.33.Final/netty-common-4.0.33.Final.jar ...
        [SUCCESSFUL ] io.netty#netty-common;4.0.33.Final!netty-common.jar (409ms)
downloading https://repo1.maven.org/maven2/org/slf4j/slf4j-api/1.7.7/slf4j-api-1.7.7.jar ...
        [SUCCESSFUL ] org.slf4j#slf4j-api;1.7.7!slf4j-api.jar (57ms)
:: resolution report :: resolve 5827ms :: artifacts dl 7749ms
        :: modules in use:
        com.datastax.cassandra#cassandra-driver-core;3.0.0 from central in [default]
        com.google.guava#guava;16.0.1 from central in [default]
        com.twitter#jsr166e;1.1.0 from central in [default]
        datastax#spark-cassandra-connector;1.6.0-M1-s_2.11 from spark-packages in [default]
        io.dropwizard.metrics#metrics-core;3.1.2 from list in [default]
        io.netty#netty-buffer;4.0.33.Final from central in [default]
        io.netty#netty-codec;4.0.33.Final from central in [default]
        io.netty#netty-common;4.0.33.Final from central in [default]
        io.netty#netty-handler;4.0.33.Final from central in [default]
        io.netty#netty-transport;4.0.33.Final from central in [default]
        joda-time#joda-time;2.3 from central in [default]
        org.apache.cassandra#cassandra-clientutil;3.0.2 from central in [default]
        org.apache.commons#commons-lang3;3.3.2 from list in [default]
        org.joda#joda-convert;1.2 from central in [default]
        org.scala-lang#scala-reflect;2.11.7 from list in [default]
        org.slf4j#slf4j-api;1.7.7 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   16  |   13  |   13  |   0   ||   16  |   13  |
        ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null

        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        16 artifacts copied, 0 already retrieved (12730kB/549ms)
16/04/08 14:48:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
error: bad symbolic reference. A signature in package.class refers to type compileTimeOnly
in package scala.annotation which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling package.class.
<console>:14: error: Reference to value sc should not have survived past type checking,
it should have been processed and eliminated during expansion of an enclosing macro.
                @transient val sc = {
                               ^
<console>:15: error: Reference to method createSQLContext in class SparkILoop should not have survived past type checking,
it should have been processed and eliminated during expansion of an enclosing macro.
                  val _sqlContext = org.apache.spark.repl.Main.interp.createSQLContext()
                                                                      ^
<console>:14: error: Reference to value sqlContext should not have survived past type checking,
it should have been processed and eliminated during expansion of an enclosing macro.
                @transient val sqlContext = {
                               ^
<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql
                ^

scala>

EDIT 1

After choosing the correct Scala version, it gets much further, but I am uncertain whether the output below still contains errors that need resolving:

[idf@node1 bin]$ spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10
Ivy Default Cache set to: /home/idf/.ivy2/cache
The jars for the packages stored in: /home/idf/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found datastax#spark-cassandra-connector;1.6.0-M1-s_2.10 in spark-packages
        found org.apache.cassandra#cassandra-clientutil;3.0.2 in central
        found com.datastax.cassandra#cassandra-driver-core;3.0.0 in central
        found io.netty#netty-handler;4.0.33.Final in central
        found io.netty#netty-buffer;4.0.33.Final in central
        found io.netty#netty-common;4.0.33.Final in central
        found io.netty#netty-transport;4.0.33.Final in central
        found io.netty#netty-codec;4.0.33.Final in central
        found io.dropwizard.metrics#metrics-core;3.1.2 in list
        found org.slf4j#slf4j-api;1.7.7 in central
        found org.apache.commons#commons-lang3;3.3.2 in list
        found com.google.guava#guava;16.0.1 in central
        found org.joda#joda-convert;1.2 in central
        found joda-time#joda-time;2.3 in central
        found com.twitter#jsr166e;1.1.0 in central
        found org.scala-lang#scala-reflect;2.10.5 in list
downloading http://dl.bintray.com/spark-packages/maven/datastax/spark-cassandra-connector/1.6.0-M1-s_2.10/spark-cassandra-connector-1.6.0-M1-s_2.10.jar ...
        [SUCCESSFUL ] datastax#spark-cassandra-connector;1.6.0-M1-s_2.10!spark-cassandra-connector.jar (2414ms)
:: resolution report :: resolve 3281ms :: artifacts dl 2430ms
        :: modules in use:
        com.datastax.cassandra#cassandra-driver-core;3.0.0 from central in [default]
        com.google.guava#guava;16.0.1 from central in [default]
        com.twitter#jsr166e;1.1.0 from central in [default]
        datastax#spark-cassandra-connector;1.6.0-M1-s_2.10 from spark-packages in [default]
        io.dropwizard.metrics#metrics-core;3.1.2 from list in [default]
        io.netty#netty-buffer;4.0.33.Final from central in [default]
        io.netty#netty-codec;4.0.33.Final from central in [default]
        io.netty#netty-common;4.0.33.Final from central in [default]
        io.netty#netty-handler;4.0.33.Final from central in [default]
        io.netty#netty-transport;4.0.33.Final from central in [default]
        joda-time#joda-time;2.3 from central in [default]
        org.apache.cassandra#cassandra-clientutil;3.0.2 from central in [default]
        org.apache.commons#commons-lang3;3.3.2 from list in [default]
        org.joda#joda-convert;1.2 from central in [default]
        org.scala-lang#scala-reflect;2.10.5 from list in [default]
        org.slf4j#slf4j-api;1.7.7 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   16  |   6   |   6   |   0   ||   16  |   1   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
        unknown resolver null

        unknown resolver sbt-chain

        unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        2 artifacts copied, 14 already retrieved (5453kB/69ms)
16/04/08 15:50:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/04/08 15:50:28 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/04/08 15:50:28 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
16/04/08 15:50:28 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
16/04/08 15:50:45 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/04/08 15:50:45 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/04/08 15:50:49 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
16/04/08 15:50:49 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
16/04/08 15:50:49 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark-latest/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
16/04/08 15:51:09 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/04/08 15:51:09 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.

scala>

Solution

You chose the Scala 2.11 build of the artifact (s_2.11). Your Spark distribution is most likely built with Scala 2.10 (the shell banner above reports "Using Scala version 2.10.5"), so use the s_2.10 artifact instead:

spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10
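
If you are unsure which Scala version your Spark distribution was built with, the spark-shell startup banner prints it (see "Using Scala version 2.10.5" in the logs above); you can also query it from the running REPL. A minimal check:

scala> // Report the Scala version the REPL (and hence this Spark build) was compiled with
scala> scala.util.Properties.versionString
res0: String = version 2.10.5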

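Once the shell starts with the matching artifact, a quick way to sanity-check the connector is to point it at a Cassandra node and read a table. This is only a sketch, assuming Cassandra is reachable at 127.0.0.1 and that my_keyspace.my_table is a placeholder for one of your own tables; the node address is passed via the connector's spark.cassandra.connection.host property:

spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10 \
  --conf spark.cassandra.connection.host=127.0.0.1

scala> import com.datastax.spark.connector._
scala> // Build an RDD of CassandraRow over the (hypothetical) table and pull one row
scala> val rdd = sc.cassandraTable("my_keyspace", "my_table")
scala> rdd.first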