sbt unresolved dependency for spark-cassandra-connector 2.0.2
Problem description
build.sbt:
val sparkVersion = "2.1.1";
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided";
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2";
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion;
Output:
[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found
Any idea? I am new to sbt and spark. Thanks
Recommended answer
This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2" having no Scala version suffix; see the Maven repo.
There are 2 solutions:
1. "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2"
   — explicitly set the Scala version in the artifact id of the dependency.
2. "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"
   — use %% with the artifact id; sbt will then automatically append your project's Scala version suffix, expanding to solution 1.
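Putting solution 2 into the original build file, a corrected build.sbt might look like the sketch below. The `scalaVersion` value "2.11.11" is an assumption chosen to match the Scala 2.11 binaries that Spark 2.1.1 and the `_2.11` connector artifact are published for; any 2.11.x release should behave the same.

```scala
// Assumed: project compiled against Scala 2.11 to match Spark 2.1.1's
// prebuilt artifacts, so %% resolves to the _2.11 variants.
scalaVersion := "2.11.11"

val sparkVersion = "2.1.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"       % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  // %% appends _2.11 automatically, yielding spark-cassandra-connector_2.11
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2",
  // equivalent to the original "spark-streaming-kafka-0-10_2.11" with a single %
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

Note that `%%` and a hard-coded `_2.11` suffix resolve to the same artifact here; `%%` is preferred because the suffix stays correct if `scalaVersion` later changes.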