eclipse (set up with a Scala environment): object apache is not a member of package org


Question

As shown in the image, I get an error when importing the Spark packages. Please help. When I hover over the import, it shows "object apache is not a member of package org". Searching for this error suggests that the Spark jars have not been imported, so I imported "spark-assembly-1.4.1-hadoop2.2.0.jar" as well, but the same error remains. Below is what I actually want to run:

import org.apache.spark.{SparkConf, SparkContext}

object ABC {

  // Scala main method
  def main(args: Array[String]) {
    println("Spark Configuration")

    val conf = new SparkConf()
    conf.setAppName("My First Spark Scala Application")
    conf.setMaster("spark://ip-10-237-224-94:7077")

    println("Creating Spark Context")
  }
}

Answer

Adding the spark-core jar to your classpath should resolve your issue. Also, if you are not already using a build tool such as Maven or Gradle, you should: spark-core has many dependencies, and you will keep running into this kind of problem with other jars. Use the Eclipse task these tools provide to set up your project's classpath properly; a minimal sbt sketch follows below.
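
For instance, a minimal build.sbt (sbt being the usual build tool for Scala projects, as an alternative to the Maven/Gradle route above) might look like the sketch below. The project name and version numbers are assumptions, chosen to match the spark-assembly-1.4.1-hadoop2.2.0.jar mentioned in the question; Spark 1.4.1 artifacts were published for Scala 2.10:

// build.sbt -- a minimal sketch; the name and versions are assumptions
// matching the spark-assembly-1.4.1-hadoop2.2.0.jar from the question.
name := "spark-scala-example"

version := "0.1"

// Spark 1.4.1 was built and published against Scala 2.10
scalaVersion := "2.10.5"

// spark-core pulls in SparkConf, SparkContext and their transitive
// dependencies, so no hand-copied assembly jar is needed on the classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

With the sbteclipse plugin added to project/plugins.sbt (addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")), running sbt eclipse regenerates Eclipse's .classpath file from these dependencies; Maven users get the same effect from mvn eclipse:eclipse, and Gradle's eclipse plugin provides a gradle eclipse task.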
