MLlib dependency error


Problem description

I'm trying to build a very simple standalone Scala app using MLlib, but I get the following error when trying to build the program:

Object Mllib is not a member of package org.apache.spark
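
The question does not show the application code itself; an MLlib import along the following lines would be enough to trigger that compile error when the spark-mllib dependency is missing (a hypothetical minimal sketch, not taken from the question):

import org.apache.spark.{SparkConf, SparkContext}
// This import is what fails to compile with the error quoted above
// when spark-mllib is not on the classpath.
import org.apache.spark.mllib.linalg.Vectors

object SimpleMLlibApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SimpleMLlibApp"))
    val v = Vectors.dense(1.0, 2.0, 3.0) // trivial MLlib usage
    println(v)
    sc.stop()
  }
}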

Then I realized that I have to add MLlib as a dependency, as follows:

version := "1"
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.1.0",
  "org.apache.spark" %% "spark-mllib" % "1.1.0"
)

But then I got an error that says:

unresolved dependency spark-core_2.10.4;1.1.1:not found

So I had to modify it to

"org.apache.spark" % "spark-core_2.10" % "1.1.1",

But there is still an error that says:

unresolved dependency spark-mllib;1.1.1 : not found

Does anyone know how to add the MLlib dependency in the .sbt file?

Recommended answer

As @lmm pointed out, you can instead include the libraries as:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10"  % "1.1.0",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)

In sbt, %% includes the Scala version in the artifact name, and you are building with Scala 2.10.4, whereas the Spark artifacts are in general published against 2.10.
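
Depending on the sbt version, %% appends either the full Scala version (2.10.4, as in the unresolved-dependency error above) or the binary version (2.10) to the artifact name. As a sketch of the difference, assuming the same Spark 1.1.0 artifacts:

// %% asks sbt to append the Scala version suffix automatically;
// with this project's settings it looked for spark-core_2.10.4, which is not published
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"

// % with an explicit _2.10 suffix names the artifact Spark actually publishes
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"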

It should be noted that if you are going to make an assembly jar to deploy your application, you may wish to mark spark-core as provided, e.g.

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10"  % "1.1.0" % "provided",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)

Since the spark-core package will be on the executor classpath anyway.
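
Putting the answer together, a complete minimal build.sbt for an assembly-based deployment might look like the sketch below (the project name is a placeholder, and it assumes the sbt-assembly plugin is set up separately):

name := "mllib-example" // hypothetical project name

version := "1"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // provided: supplied by the Spark runtime on the cluster, kept out of the assembly jar
  "org.apache.spark" % "spark-core_2.10"  % "1.1.0" % "provided",
  // bundled into the assembly jar alongside the application code
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)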
