Loading the right dependencies for sbt console in multi project setup causing derby security exception
Problem description
I have an SBT multi-project setup, outlined at https://github.com/geoHeil/sf-sbt-multiproject-dependency-problem , and want to be able to execute sbt console
in the root project.
When executing:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().master("local[*]").enableHiveSupport.getOrCreate
spark.sql("CREATE database foo")
in the root console, the error is:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.EmbeddedDriver
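As an aside, "Could not initialize class" means the class *was* found but its static initializer failed (here because of the security exception quoted further down), as opposed to the class being missing from the classpath entirely. A minimal probe to tell the two cases apart from the REPL (the helper name is illustrative, not from the original question):

```scala
// returns true if the class is present on the current classpath;
// initialize = false means only presence is checked, so a class whose
// static initializer would fail (the "could not initialize" case)
// still reports true here
def loadable(name: String): Boolean =
  try {
    Class.forName(name, false, Thread.currentThread.getContextClassLoader)
    true
  } catch { case _: ClassNotFoundException => false }

println(loadable("org.apache.derby.jdbc.EmbeddedDriver"))
```

If this prints `false` in the root console but `true` in the sub-project console, the two REPLs are resolving different classpaths.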
Strangely, it works just fine in the sub project:
sbt
project common
console
and then pasting the same code.
- How can I fix the sbt console so that it loads the right dependencies directly?
- How can I load the console for the sub project directly? sbt common/console does not seem to solve the problem.
The most important settings are below:
lazy val global = project
  .in(file("."))
  .settings(
    settings,
    libraryDependencies ++= commonDependencies
  )
  .aggregate(common)
  .dependsOn(common)

lazy val common = project
  .settings(
    name := "common",
    settings,
    libraryDependencies ++= commonDependencies
  )

lazy val dependencies =
  new {
    val sparkV = "2.3.0"
    val sparkBase = "org.apache.spark" %% "spark-core" % sparkV % "provided"
    val sparkSql  = "org.apache.spark" %% "spark-sql"  % sparkV % "provided"
    val sparkHive = "org.apache.spark" %% "spark-hive" % sparkV % "provided"
  }

lazy val commonDependencies = Seq(
  dependencies.sparkBase,
  dependencies.sparkHive,
  dependencies.sparkSql
)

lazy val settings = commonSettings

lazy val commonSettings = Seq(
  fork := true,
  run in Compile := Defaults
    .runTask(fullClasspath in Compile, mainClass.in(Compile, run), runner.in(Compile, run))
    .evaluated
)
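For the first question, one direction worth trying (a sketch only, not verified against this exact build; it uses the same sbt 0.13 scoping syntax as the settings above) is to point the root project's console task at the sub-project, so the REPL starts with common's fully resolved classpath:

```scala
// hypothetical: make `sbt console` in the root start the REPL
// with the classpath of the `common` sub-project instead of the
// root's own (possibly incomplete) one
console in Compile := (console in (common, Compile)).value
```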
Related questions
- Transitive dependency errors in SBT multi-project
- SBT test does not work for spark test
The strange thing is: for Spark version 2.2.0 this setup works just fine. Only 2.2.1 / 2.3.0 cause these problems, yet they work fine in a single-project setup or when the console is started in the right project.
Also,
java.security.AccessControlException: access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )
is mentioned in the stack trace.
Answer
Actually, the fix from SBT test does not work for spark test applies here as well. The code:
if (appName == "dev") {
  System.setSecurityManager(null)
}
fixes it for development.
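A slightly more defensive shape for that workaround keeps the environment check in one place (the helper names and the "dev" convention are illustrative, not from the original build); disabling the SecurityManager must stay strictly a development-time measure:

```scala
// Derby's embedded engine requests SystemPermission("engine", "usederbyinternals"),
// which the SecurityManager active under the sbt console denies. Clearing the
// manager lets the Hive metastore initialize. Never do this in production.
def shouldRelaxSecurity(appName: String): Boolean = appName == "dev"

def relaxSecurityForDev(appName: String): Unit =
  if (shouldRelaxSecurity(appName)) System.setSecurityManager(null)
```

Calling relaxSecurityForDev("dev") before SparkSession.builder().enableHiveSupport.getOrCreate should then avoid the AccessControlException.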
See also:
- https://github.com/holdenk/spark-testing-base/issues/148
- https://issues.apache.org/jira/browse/SPARK-22918