How to build an Uber JAR (Fat JAR) using SBT within IntelliJ IDEA?



I'm using SBT (within IntelliJ IDEA) to build a simple Scala project.

I would like to know what is the simplest way to build an Uber JAR file (aka Fat JAR, Super JAR).

I'm currently using SBT, but when I submit my JAR file to Apache Spark I get the following error:

Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes

Or this error during compilation time:

java.lang.RuntimeException: deduplicate: different file contents found in the following:
PATHDEPENDENCY.jar:META-INF/DEPENDENCIES
PATHDEPENDENCY.jar:META-INF/MANIFEST.MF

It looks like this is because some of my dependencies include signature files (META-INF) which need to be removed from the final Uber JAR file.

I tried to use the sbt-assembly plugin like this:

/project/assembly.sbt

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

/project/plugins.sbt

logLevel := Level.Warn

/build.sbt

lazy val commonSettings = Seq(
  name := "Spark-Test",
  version := "1.0",
  scalaVersion := "2.11.4"
)

lazy val app = (project in file("app")).
  settings(commonSettings: _*).
  settings(
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.2.0",
      "org.apache.spark" %% "spark-streaming" % "1.2.0",
      "org.apache.spark" % "spark-streaming-twitter_2.10" % "1.2.0"
    )
  )

When I click "Build Artifact..." in IntelliJ IDEA I get a JAR file. But I end up with the same error...

I'm new to SBT and not very experienced with the IntelliJ IDE.

Thanks.

Solution

In the end, I skipped IntelliJ IDEA entirely to avoid adding noise to my overall understanding :)

I started reading the official SBT tutorial.

I created my project with the following file structure:

my-project/project/assembly.sbt
my-project/src/main/scala/myPackage/MyMainObject.scala
my-project/build.sbt

I added the sbt-assembly plugin to my assembly.sbt file, allowing me to build a fat JAR:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

My minimal build.sbt looks like:

lazy val root = (project in file(".")).
  settings(
    name := "my-project",
    version := "1.0",
    scalaVersion := "2.11.4",
    mainClass in Compile := Some("myPackage.MyMainObject")        
  )

val sparkVersion = "1.2.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)

// META-INF discarding
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => MergeStrategy.first
  }
}
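Discarding all of META-INF is a blunt instrument: it can also throw away files some libraries need at runtime, such as service-loader registrations under META-INF/services. A more selective variant (a sketch using the same sbt-assembly 0.12 API, not part of the original answer) would drop only the signature files that trigger the SecurityException and fall back to the previous strategy for everything else:

```scala
// Sketch: discard only signature files (*.SF, *.DSA, *.RSA);
// delegate every other path to the previous merge strategy.
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", xs @ _*) if xs.nonEmpty &&
        Seq(".sf", ".dsa", ".rsa").exists(xs.last.toLowerCase.endsWith) =>
      MergeStrategy.discard
    case x => old(x)
  }
}
```

Note that with this variant, genuine content conflicts outside the signature files (like the META-INF/DEPENDENCIES clash above) are still handled by the default strategy, so they may need their own cases.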

Note: % "provided" means the dependency is not included in the final fat JAR (those libraries are already present on my workers)

Note: the META-INF discarding was inspired by this answer.

Note: Meaning of % and %%
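To make the % vs %% note concrete: with scalaVersion := "2.11.4", %% appends the Scala binary version suffix to the artifact name at resolution time, so the following two dependency declarations (an illustrative sbt fragment, not from the original answer) resolve to the same artifact:

```scala
// %% appends the Scala binary version (_2.11 here) automatically:
"org.apache.spark" %% "spark-streaming-twitter" % "1.2.0"
// ...which is equivalent to spelling the suffix out with a plain %:
"org.apache.spark" %  "spark-streaming-twitter_2.11" % "1.2.0"
```

This also explains the questioner's "spark-streaming-twitter_2.10" line: pairing a _2.10 artifact with scalaVersion 2.11.4 mixes Scala binary versions, which causes problems of its own.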

Now I can build my fat JAR using SBT (how to install it) by running the following command in my /my-project root folder:

sbt assembly

My fat JAR is now located in the newly generated /target folder:

/my-project/target/scala-2.11/my-project-assembly-1.0.jar

Hope that helps someone else.


For those who want to embed SBT within the IntelliJ IDE: How to run sbt-assembly tasks from within IntelliJ IDEA?

