sbt unresolved dependency for spark-cassandra-connector 2.0.2


Question

build.sbt:

val sparkVersion = "2.1.1"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion

Output:

[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found

Any ideas? I am new to sbt and Spark. Thanks.

Answer

This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2" lacking the Scala version suffix in the artifact id; see the Maven repo:

http://search.maven.org/#artifactdetails%7Ccom.datastax.spark%7Cspark-cassandra-connector_2.11%7C2.0.2%7Cjar

There are two solutions:

  1. "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2" — explicitly set the Scala version in the artifact id.
  2. "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2" — use %% with the artifact id, so sbt automatically appends your project's Scala version suffix, expanding to solution 1.
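
Applying solution 2, the build.sbt from the question could look like the sketch below. The scalaVersion line is an assumption added for illustration (Spark 2.1.x artifacts are published for Scala 2.11):

```scala
// Sketch of a corrected build.sbt, assuming the project targets Scala 2.11
scalaVersion := "2.11.8"

val sparkVersion = "2.1.1"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

// Solution 2: %% appends "_2.11" to the artifact id automatically,
// so this resolves com.datastax.spark:spark-cassandra-connector_2.11:2.0.2
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"

// This artifact id already hard-codes the Scala suffix, so plain % is correct here
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion
```

Either spelling works; %% is generally preferred because the dependency follows the project's Scala version if you later cross-build.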
