What do I need to import to make `SparkConf` resolvable?


Problem Description


I am setting up a Java Spark application and am following the DataStax documentation on getting started with the Java API. I've added

<dependencies>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.1.1</version>
    </dependency>
    ...
</dependencies>


and, for a dse.jar previously installed into my local Maven repository (see the install sketch after this snippet):

<dependency>
    <groupId>com.datastax</groupId>
    <artifactId>dse</artifactId>
    <version>version number</version>
</dependency>
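For reference, a local jar like this is usually put into the repository with the Maven install plugin. A minimal sketch, assuming the jar sits in the current directory and reusing the coordinates from the dependency above (VERSION is a placeholder for the actual DSE version):

# hypothetical install of dse.jar into the local Maven repository;
# replace VERSION with the actual DSE version
mvn install:install-file -Dfile=dse.jar \
    -DgroupId=com.datastax -DartifactId=dse \
    -Dversion=VERSION -Dpackaging=jar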

The next step in the guide is:

SparkConf conf = DseSparkConfHelper.enrichSparkConf(new SparkConf())
        .setAppName("My application");
DseSparkContext sc = new DseSparkContext(conf);


However, the class SparkConf can't be resolved. Should it be? Am I missing some additional Maven dependency? If so, which one?

Answer


The class is org.apache.spark.SparkConf, which lives in the spark-core_<Scala version> artifact.


So your pom.xml might look like this:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.1</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.5.0-M2</version>
    </dependency>
    <dependency>
        <groupId>com.datastax</groupId>
        <artifactId>dse</artifactId>
        <version>version number</version>
    </dependency>
</dependencies>
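Once spark-core is on the classpath, the import resolves. A minimal compile check, sketched using only the plain Spark API (the class name SparkConfCheck is made up for illustration; the DSE-specific DseSparkConfHelper and DseSparkContext come from the dse.jar dependency, not from spark-core):

// SparkConf lives in spark-core, not in the connector or in dse.jar
import org.apache.spark.SparkConf;

public class SparkConfCheck {
    public static void main(String[] args) {
        // Compiles and runs once spark-core_2.10 is on the classpath
        SparkConf conf = new SparkConf().setAppName("My application");
        System.out.println(conf.toDebugString());
    }
}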


The spark-core JAR is also located in:
dse_install/resources/spark/lib/spark_core_2.10-version.jar (tarball)
or:
/usr/share/dse/spark/lib/spark_core_2.10-version.jar (package installs)
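If you prefer to build against the DSE-bundled jar rather than pull spark-core from Maven Central, it can be installed into the local repository with the same mvn install:install-file approach sketched above, then referenced with matching org.apache.spark coordinates.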
