Setting KieContainer kie.maven.settings.custom
Question
I am using Drools with Scala in Apache Spark. In order for KIE to know to look in my private Nexus repository for the compiled rules, I am attempting to set the system property "kie.maven.settings.custom" to a custom settings.xml file that is included in the working directory of the jar file.
I am attempting to set this system property at run time, but am having mixed results. Here is my code to set "kie.maven.settings.custom":
import java.nio.file.Paths

val userDir = System.getProperty("user.dir")
val customPath = Paths.get(userDir, "settings.xml")
System.setProperty("kie.maven.settings.custom", customPath.toString)
This appears to work properly when running on my local machine; in the cluster, however, it does not.
My question is: should I expect this to work, or does it create a race condition with KIE reading the system property?
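My worry about the race condition is that KIE may read the property only once, when its Maven settings loader class is first initialized. That pattern can be illustrated without Drools; in this sketch, `CachedConfig` is a hypothetical stand-in for a library that captures a system property at class-load time:

```scala
import java.nio.file.Paths

// Hypothetical stand-in for a library that snapshots a system property
// the first time its class is initialized.
object CachedConfig {
  val settingsPath: String =
    System.getProperty("kie.maven.settings.custom", "<default>")
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Set the property BEFORE the first reference to CachedConfig...
    val custom = Paths.get(System.getProperty("user.dir"), "settings.xml").toString
    System.setProperty("kie.maven.settings.custom", custom)

    // ...so the value captured at class initialization is the custom path,
    // not the default. Touching CachedConfig before setProperty would
    // freeze "<default>" instead.
    println(CachedConfig.settingsPath)
  }
}
```

Because Scala objects initialize lazily on first access, the order of `setProperty` relative to the first KIE call is what matters, not where the property is set in the code.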
My custom settings.xml file looks like:
<?xml version="1.0" encoding="UTF-8"?>
<profilesXml xmlns="http://maven.apache.org/PROFILES/1.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/PROFILES/1.0.0 http://maven.apache.org/xsd/profiles-1.0.0.xsd">
  <profiles>
    <profile>
      <id>profile-1</id>
      <activation>
        <activeByDefault>true</activeByDefault>
      </activation>
      <repositories>
        <repository>
          <id>temp-nexus</id>
          <url>http://nexus.path..com:8081/repository/maven-releases</url>
          <releases>
            <updatePolicy>always</updatePolicy>
          </releases>
        </repository>
      </repositories>
    </profile>
  </profiles>
</profilesXml>
Answer
I was able to figure this out, and it turned out to be a Spark issue, and nothing to do with Drools.
To solve this I had to add my custom settings.xml to the Spark job's classpath and set some extra Spark properties. In the Livy configuration file this looks like:
"conf":{
"spark.driver.extraJavaOptions": "-Dkie.maven.settings.custom=settings.xml",
"spark.executor.extraJavaOptions": "-Dkie.maven.settings.custom=settings.xml"
}
I set this for both the driver and the executor extra options, although setting only the executor extra options should have been sufficient; I have not tested that.
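Note that the `-D` value here is the relative path `settings.xml`, so the file itself must also be present in each container's working directory. Outside Livy, a `spark-submit` equivalent might look like the following sketch (`com.example.Main` and `app.jar` are placeholders; `--files` ships settings.xml into each executor's working directory):

```shell
spark-submit \
  --files settings.xml \
  --conf "spark.driver.extraJavaOptions=-Dkie.maven.settings.custom=settings.xml" \
  --conf "spark.executor.extraJavaOptions=-Dkie.maven.settings.custom=settings.xml" \
  --class com.example.Main \
  app.jar
```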