sbt build failed for spark scala with xgboost


Problem description

./build/sbt clean package gives the below error:

Resolving org.fusesource.jansi#jansi;1.4 ...
   [warn]   ::::::::::::::::::::::::::::::::::::::::::::::
   [warn]   ::          UNRESOLVED DEPENDENCIES         ::
   [warn]   ::::::::::::::::::::::::::::::::::::::::::::::
   [warn]   :: ml.dmlc.xgboost#xgboost4j_2.10;0.7: not found
   [warn]   :: ml.dmlc.xgboost#xgboost4j-spark_2.10;0.7: not found
   [warn]   ::::::::::::::::::::::::::::::::::::::::::::::

build.sbt looks like below:

name := "xgboostproj"
version := "1.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.6.1"
resolvers += Resolver.mavenLocal
libraryDependencies += "ml.dmlc.xgboost" %% "xgboost4j" % "0.7"
libraryDependencies += "ml.dmlc.xgboost" %% "xgboost4j-spark" % "0.7"

Thanks in advance!

Recommended answer

The xgboost jars must be built locally and published to your local Maven repository for your setup to work. The instructions for this are published here
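As a sketch, building and publishing the xgboost JVM artifacts to the local Maven repository typically looks like the following (the repository URL and the `mvn install` step are assumptions based on the standard layout of the xgboost `jvm-packages` module, not taken from the question):

```shell
# Clone xgboost with its submodules; the JVM packages build
# the native library from these sources
git clone --recursive https://github.com/dmlc/xgboost.git
cd xgboost/jvm-packages

# Build xgboost4j and xgboost4j-spark and install them into the
# local Maven repository (~/.m2/repository), skipping tests
mvn install -DskipTests
```

After this, sbt can resolve the artifacts as long as `Resolver.mavenLocal` is on the resolver list, as in the build.sbt above.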

Additionally, the dependencies should be like below (the groupId is ml.dmlc, not ml.dmlc.xgboost):

libraryDependencies += "ml.dmlc" %% "xgboost4j" % "0.7"
libraryDependencies += "ml.dmlc" %% "xgboost4j-spark" % "0.7"

