Spark library conflict when using the cosmosdb libraries

This article describes how to resolve a Spark library conflict that appears when the cosmosdb libraries are added to a build; it may be a useful reference for anyone hitting the same problem.

Problem description

I keep getting a Spark library conflict with the cosmosdb libraries and am unable to resolve it. Please help!

build.sbt

name := "myApp"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.3.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.3.0",
  "org.apache.spark" % "spark-streaming_2.11" % "2.3.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.3.0",
  "com.microsoft.azure" % "azure-storage" % "2.0.0",
  "org.apache.hadoop" % "hadoop-azure" % "2.7.3",
  "com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.0.0",
  "com.microsoft.azure" % "azure-documentdb" % "1.14.2",
  "com.microsoft.azure" % "azure-documentdb-rx" % "0.9.0-rc2",
  "io.reactivex" % "rxjava" % "1.3.0",
  "io.reactivex" % "rxnetty" % "0.4.20",
  "org.json" % "json" % "20140107",
  "org.jmockit" % "jmockit" % "1.34" % "test"
)
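
As an aside, sbt can append the Scala binary suffix itself when a dependency is declared with %% instead of a hard-coded _2.11, which rules out mistyped suffixes on the Spark artifacts. A minimal sketch of the same Spark entries, behavior unchanged:

// %% appends the project's scalaVersion suffix (_2.11 here) automatically.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "2.3.0",
  "org.apache.spark" %% "spark-sql"       % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  "org.apache.spark" %% "spark-mllib"     % "2.3.0"
)

The cosmosdb connector has to keep its full artifact name, because azure-cosmosdb-spark_2.2.0_2.11 encodes the Spark version as well as the Scala version, which %% cannot generate.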

The compile error I get is:

[warn] Run 'evicted' to see detailed eviction warnings
[error] Modules were resolved with conflicting cross-version suffixes in 
[error]    org.apache.spark:spark-launcher _2.11, _2.10
[error]    org.json4s:json4s-ast _2.11, _2.10
[error]    org.apache.spark:spark-network-shuffle _2.11, _2.10
[error]    com.twitter:chill _2.11, _2.10
[error]    org.json4s:json4s-jackson _2.11, _2.10
[error]    com.fasterxml.jackson.module:jackson-module-scala _2.11, _2.10
[error]    org.json4s:json4s-core _2.11, _2.10
[error]    org.apache.spark:spark-unsafe _2.11, _2.10
[error]    org.apache.spark:spark-core _2.11, _2.10
[error]    org.apache.spark:spark-network-common _2.11, _2.10
[error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-core, org.apache.spark:spark-network-common
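
This error means that artifacts built against two incompatible Scala binary versions (_2.10 and _2.11) of the same modules ended up on the classpath, so one of the declared dependencies must be pulling in _2.10 artifacts transitively. One way to find the offender is the sbt-dependency-graph plugin (the plugin version below is an assumption; check for the release matching your sbt version):

// project/plugins.sbt — assumed plugin coordinates and version
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

Then, from the sbt shell, whatDependsOn org.apache.spark spark-core_2.10 prints the dependency chain that drags the _2.10 artifact in (older plugin releases may also require an explicit version argument).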

Thanks

Recommended answer

I have come across this issue. You probably have to use Scala 2.10 with the above in order to avoid the conflict.
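
A minimal sketch of a build aligned on Scala 2.10, following that suggestion. Note that Spark 2.3.x no longer publishes _2.10 artifacts, so Spark is pinned back to 2.2.0 here; the _2.10 cosmosdb connector coordinates are hypothetical, so verify on Maven Central that such a build exists before relying on this:

scalaVersion := "2.10.7"

libraryDependencies ++= Seq(
  // Spark 2.2.x is the last release line with Scala 2.10 support
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql"  % "2.2.0",
  // hypothetical _2.10 connector artifact: confirm it exists before using
  "com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.10" % "1.0.0"
)

Alternatively, staying on Scala 2.11 and stripping the transitive _2.10 artifacts with an ExclusionRule on the offending dependency (once whatDependsOn has identified it) avoids the downgrade.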

