sbt build failed for spark scala with xgboost


Problem description


./build/sbt clean package gives the below error:

Resolving org.fusesource.jansi#jansi;1.4 ...
   [warn]   ::::::::::::::::::::::::::::::::::::::::::::::
   [warn]   ::          UNRESOLVED DEPENDENCIES         ::
   [warn]   ::::::::::::::::::::::::::::::::::::::::::::::
   [warn]   :: ml.dmlc.xgboost#xgboost4j_2.10;0.7: not found
   [warn]   :: ml.dmlc.xgboost#xgboost4j-spark_2.10;0.7: not found
   [warn]   ::::::::::::::::::::::::::::::::::::::::::::::


build.sbt looks like below:

name := "xgboostproj"
version := "1.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.6.1"
resolvers += Resolver.mavenLocal
libraryDependencies += "ml.dmlc.xgboost" %% "xgboost4j" % "0.7"
libraryDependencies += "ml.dmlc.xgboost" %% "xgboost4j-spark" % "0.7"

Thanks in advance!

Recommended answer


The xgboost jars must be built locally and published to your local Maven repository for your setup to work. The instructions for this are published here.
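As a rough sketch of the build-and-publish step described above (the exact paths, branch tag, and flags follow the xgboost 0.7-era documentation and may differ for your platform or version):

```shell
# Hedged sketch: build xgboost from source and install the JVM packages
# into the local Maven repository (~/.m2), where Resolver.mavenLocal
# in build.sbt can find them.
git clone --recursive https://github.com/dmlc/xgboost.git
cd xgboost
git checkout v0.7            # match the version pinned in build.sbt (assumed tag name)
make -j4                     # build the native xgboost library first
cd jvm-packages
mvn install -DskipTests      # publishes xgboost4j and xgboost4j-spark locally
```

After `mvn install` completes, sbt can resolve the `ml.dmlc` artifacts from the local repository.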


Additionally, the dependencies should look like the below (the groupId is ml.dmlc, not ml.dmlc.xgboost):

libraryDependencies += "ml.dmlc" %% "xgboost4j" % "0.7"
libraryDependencies += "ml.dmlc" %% "xgboost4j-spark" % "0.7"
