IntelliJ Idea 14: cannot resolve symbol spark


Question


I added a Spark dependency, and it worked in my first project. But when I try to make a new project with Spark, sbt does not import the external jars of org.apache.spark, so IntelliJ IDEA gives the error that it "cannot resolve symbol". I already tried to make a new project from scratch and use auto-import, but neither works. When I try to compile, I get the message "object apache is not a member of package org". My build.sbt looks like this:

name := "hello"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-parent_2.10" % "1.4.1"


I have the impression that there might be something wrong with my sbt settings, although it already worked once. And except for the external libraries, everything is the same... I also tried to import the pom.xml file of my Spark dependency, but that doesn't work either. Thank you in advance!
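The "object apache is not a member of package org" error is consistent with two problems in the build.sbt above: `spark-parent` is Spark's aggregator POM rather than a code artifact, and its `_2.10` suffix conflicts with `scalaVersion := "2.11.7"` (Scala binary versions are not compatible across 2.10/2.11). A minimal corrected sketch of that build file, assuming the project only needs Spark core:

```scala
name := "hello"
version := "1.0"
scalaVersion := "2.11.7"

// Depend on a real code artifact (spark-core, not spark-parent),
// with a Scala-binary suffix that matches scalaVersion above.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1"
```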

Answer

This worked for me ->

name := "ProjectName"
version := "0.1"
scalaVersion := "2.11.11"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
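A side note on keeping the suffixes in sync: sbt's `%%` operator appends the Scala binary-version suffix (`_2.11` here) automatically, which avoids hard-coding it in each artifact name and prevents mismatches with `scalaVersion`. The same dependencies could be sketched as:

```scala
libraryDependencies ++= Seq(
  // %% resolves to spark-core_2.11, spark-sql_2.11, spark-mllib_2.11
  "org.apache.spark" %% "spark-core"  % "2.2.0",
  "org.apache.spark" %% "spark-sql"   % "2.2.0",
  "org.apache.spark" %% "spark-mllib" % "2.2.0"
)
```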

