UNRESOLVED DEPENDENCIES error while trying to create jar


Problem Description

I'm trying to build a Scala jar file to run it in Spark.
I'm following this tutorial.
When trying to build the jar file using sbt as described here, I'm facing the following error:

[info] Resolving org.apache.spark#spark-core_2.10.4;1.0.2 ...
[warn]  module not found: org.apache.spark#spark-core_2.10.4;1.0.2
[warn] ==== local: tried
[warn]   /home/hduser/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.0.2/ivys/ivy.xml
[warn] ==== Akka Repository: tried
[warn]   http://repo.akka.io/releases/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-d57abf/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[error] Total time: 2 s, completed 13 Aug, 2014 5:24:24 PM
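
For reference, the build definition from that tutorial presumably looked something like the following (a reconstruction inferred from the resolvers and versions in the log above, not necessarily the asker's exact file):

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// %% appends the Scala version to the artifact name,
// so this resolves to spark-core_2.10.4
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.2"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"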

What's the issue and how do I solve it?

Update: The dependency issue has been resolved. Thank you, om-nom-nom. But a new error arose:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::              FAILED DOWNLOADS            ::
[warn]  :: ^ see resolution messages for details  ^ ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[warn]  :: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[warn]  :: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[warn]  :: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-c011e4/*:update: sbt.ResolveException: download failed: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[error] Total time: 855 s, completed 14 Aug, 2014 12:28:33 PM

Recommended Answer

Your dependency is defined as:

"org.apache.spark" %% "spark-core" % "1.0.2"

The %% instructs sbt to append the current Scala version to the artifact name. Apparently, Spark was built for the whole Scala 2.10 family, with no version-specific jars for 2.10.1, 2.10.2, and so on, so the artifact sbt requests (spark-core_2.10.4) simply doesn't exist.

So all you have to do is redefine it as:

"org.apache.spark" % "spark-core_2.10" % "1.0.2"

