How to add Spark dependencies in a Spring Boot multi-module Java 11 project
Question
Whenever I add a module-info.java to my multi-module project, I cannot import my Spark dependencies - everything else seems to be working:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
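For context, code like the following is what stops compiling once the module descriptor is added; the class below is only a minimal illustration, not my actual code:

import org.apache.spark.sql.SparkSession;  // this import no longer resolves once module-info.java exists

public class SparkSmokeTest {
    public static void main(String[] args) {
        // minimal local session, just to exercise the spark-sql dependency
        SparkSession spark = SparkSession.builder()
                .appName("module-info-test")
                .master("local[*]")
                .getOrCreate();
        spark.stop();
    }
}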
IntelliJ tries to re-add the Maven dependency, without any result.
My module-info looks like this:
module common {
    exports [...]
    requires lombok;
    requires spring.data.jpa;
    requires spring.data.commons;
    requires org.apache.commons.lang3;
    requires spring.context;
    requires spring.web;
    requires spring.security.core;
    requires com.google.common;
    requires org.json;
    requires spring.core;
    requires spring.beans;
    requires com.fasterxml.jackson.core;
    requires com.fasterxml.jackson.databind;
    requires spring.jcl;
    requires spring.webmvc;
    requires mongo.java.driver;
    requires org.hibernate.orm.core;
    requires com.fasterxml.jackson.dataformat.csv;
    requires java.sql;
}
Adding org.apache.* to my module-info.java is not possible either.
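For illustration, a directive along these lines is what I have tried in the descriptor; the module name is a guess on my side, and no variant of it resolves:

module common {
    // guessed module name - neither this nor any other org.apache.* variant resolves
    requires org.apache.spark;
}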
Is it possible that Spark is not yet ready for Jigsaw modules and Java 9+?
Accepted answer
"Is it possible that Spark is not ready for Jigsaw modules and Java 9+?"
It does hold true for Spark. Two immediate reasons that I can vouch for are:
- They do not have an Automatic-Module-Name: <module-name> entry in the artifacts' MANIFEST.MF file (see the sketch after the jar command below).
- If you try using the jar tool:
jar --describe-module --file=<complete-path>/spark-core_2.12-3.0.0-preview2.jar
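For comparison, if Spark ever shipped that manifest entry, say Automatic-Module-Name: spark.core (a purely hypothetical name), a consuming descriptor could require it by that stable name; without the entry, the jar command above has to derive a module name from the jar's file name instead. A minimal sketch under that assumption:

module common {
    // spark.core is hypothetical here; Spark currently publishes no automatic module name
    requires spark.core;
}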