How to compile/package Spark 2.0 project with external jars and Maven


Question


Since version 2.0, Apache Spark is bundled with a folder "jars" full of .jar files. Obviously Maven will download all these jars when issuing:

mvn -e package

because in order to submit an application with

spark-submit --class DataFetch target/DataFetch-1.0-SNAPSHOT.jar

the DataFetch-1.0-SNAPSHOT.jar is needed.


So, the first question is straightforward: how can I take advantage of these existing jars? The second question is related: since I've tried for the first time to have Maven download the jars, I got the following output:

[INFO] Error stacktraces are turned on.
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building "DataFetch" 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ DataFetch ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /root/sparkTests/scalaScripts/DataFetch/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ DataFetch ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ DataFetch ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /root/sparkTests/scalaScripts/DataFetch/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ DataFetch ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.10:test (default-test) @ DataFetch ---
[INFO] No tests to run.
[INFO] Surefire report directory: /root/sparkTests/scalaScripts/DataFetch/target/surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]
[INFO] --- maven-jar-plugin:2.3.2:jar (default-jar) @ DataFetch ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.294s
[INFO] Finished at: Wed Sep 28 17:41:29 PYT 2016
[INFO] Final Memory: 14M/71M
[INFO] ------------------------------------------------------------------------


And here is my pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.spark.pg</groupId>
  <artifactId>DataFetch</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>"DataFetch"</name>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.0.0</version>
        </dependency>
    </dependencies>

     <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.0</version>
            </plugin>
        </plugins>
    </build>  

</project>


If more information is needed, please don't hesitate to ask for it.

Answer


I am not sure whether I understand your problem, but I'll try to answer.

Based on the Spark Bundling Your Application's Dependencies documentation:


When creating assembly jars, list Spark and Hadoop as provided dependencies; these need not be bundled since they are provided by the cluster manager at runtime.


You can set the scope to provided in your Maven pom.xml file:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <!-- add this scope -->
    <scope>provided</scope>
</dependency>
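
Note that this snippet references a ${spark.version} property, while the pom.xml in the question hardcodes the version. If you adopt the property form, define it in the properties section; a minimal sketch, assuming you stay on Spark 2.0.0:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <!-- assumed value; should match the Spark version installed on the cluster -->
    <spark.version>2.0.0</spark.version>
</properties>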


The second thing I noticed is that the Maven build creates an empty JAR:

[WARNING] JAR will be empty - no content was marked for inclusion!
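
This warning lines up with the "No sources to compile" messages earlier in the log: Maven found no compiled classes to include. The project path suggests Scala sources, which normally belong under src/main/scala and need a Scala compiler plugin that the pom.xml above does not configure. A minimal sketch of such a plugin entry, assuming the commonly used scala-maven-plugin:

<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.2.2</version>
    <executions>
        <execution>
            <!-- bind Scala compilation into the standard Maven lifecycle -->
            <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>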


If you have any other dependencies, you should package these dependencies into the final jar archive.


You can add something like the following to your pom.xml and run mvn package:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>2.6</version>
    <configuration>
        <!-- package with project dependencies -->
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
            <manifest>
                <mainClass>YOUR_MAIN_CLASS</mainClass>
            </manifest>
        </archive>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>


The Maven log should print a line about building the jar:

[INFO] --- maven-assembly-plugin:2.4.1:single (make-assembly) @ dateUtils ---
[INFO] Building jar: path/target/APPLICATION_NAME-jar-with-dependencies.jar


After the Maven package phase, you should see DataFetch-1.0-SNAPSHOT-jar-with-dependencies.jar in the target folder, and you can submit this jar with spark-submit.
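
For example, reusing the class name from the question (assumed to be in the default package; otherwise pass its fully qualified name):

spark-submit --class DataFetch target/DataFetch-1.0-SNAPSHOT-jar-with-dependencies.jar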
