How to add Spark dependencies in a Spring Boot multi-module Java 11 project

Question

Whenever I add a module-info.java to my multi-module project, I can no longer import my Spark dependencies - everything else seems to work:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>

IntelliJ tries to re-add the Maven dependency without any result.

My module-info looks as follows:

module common {
    exports [...] 
    requires lombok;
    requires spring.data.jpa;
    requires spring.data.commons;
    requires org.apache.commons.lang3;
    requires spring.context;
    requires spring.web;
    requires spring.security.core;
    requires com.google.common;
    requires org.json;
    requires spring.core;
    requires spring.beans;
    requires com.fasterxml.jackson.core;
    requires com.fasterxml.jackson.databind;
    requires spring.jcl;
    requires spring.webmvc;
    requires mongo.java.driver;
    requires org.hibernate.orm.core;
    requires com.fasterxml.jackson.dataformat.csv;
    requires java.sql;
}

It is not possible to add org.apache.* to my module-info.java either.
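
For illustration, a directive like the following is what fails to resolve (spark.core here is a hypothetical module name, used only for this sketch; as it turns out, no valid automatic module name can be derived from the Spark jars at all):

module common {
    // hypothetical attempt: fails, since no Spark artifact provides
    // (or can derive) a module named spark.core
    requires spark.core;
}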

Is it possible that Spark is not ready for Jigsaw modules and Java 9+ yet?

Answer

Is it possible that Spark is not ready for Jigsaw modules and Java 9+?

It does hold true for Spark. Two immediate reasons that I can vouch for are:

  1. They do not have an entry for

Automatic-Module-Name: <module-name> 

in the MANIFEST.MF file of their artifacts.
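
For comparison, a library that wants to be usable as a named automatic module sets this entry in its build. A minimal Maven sketch (com.example.common is an assumed module name for illustration, not anything Spark ships):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <archive>
            <manifestEntries>
                <!-- assumed module name, for illustration only -->
                <Automatic-Module-Name>com.example.common</Automatic-Module-Name>
            </manifestEntries>
        </archive>
    </configuration>
</plugin>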

  2. If you try to describe the jar using the jar tool

jar --describe-module --file=<complete-path>/spark-core_2.12-3.0.0-preview2.jar

the operation fails: the automatic module name derived from the file name would be spark.core.2.12, whose components '2' and '12' are not valid Java identifiers, so no module descriptor can be derived.

A few resources that might be useful once you reach here:

  • The reason why deriving automatic module name fails for spark artifacts
  • A way to update a jar manually with the MANIFEST entry (see the sketch after this list)
  • Spark's progress to Build and Run on JDK-11
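
On that second resource, a rough sketch of patching a local copy of the jar with the missing entry (spark.core is an invented module name, an assumption for this sketch only):

# write the missing manifest entry to a small file
echo "Automatic-Module-Name: spark.core" > automatic-module-name.txt

# merge it into the jar's MANIFEST.MF, then verify the result
jar --update --file=spark-core_2.12-3.0.0-preview2.jar --manifest=automatic-module-name.txt
jar --describe-module --file=spark-core_2.12-3.0.0-preview2.jar

Note that this only makes the jar resolvable as a named automatic module; it does not make Spark itself Jigsaw-ready.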
