Cannot apply patch LUCENE-2899.patch to Solr on Windows


Question

I am trying to apply the patch LUCENE-2899.patch to Solr.

This is what I have done:

  1. Cloned solr from the official repository (I am on the master branch)
  2. Downloaded and installed Ant and GNU patch, which I got here: http://gnuwin32.sourceforge.net/packages/patch.htm
  3. Put Ant and GNU patch on the PATH environment variable.
  4. I got this...

```

D:\utils\solr_master\lucene-solr>patch -p1 -i LUCENE-2899.patch --dry-run
patching file dev-tools/idea/.idea/ant.xml
Assertion failed: hunk, file ../patch-2.5.9-src/patch.c, line 354

This application has requested the Runtime to terminate it in an unusual way.
Please contact the application's support team for more information.

```
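
If GNU patch keeps crashing like this, one possible workaround (a sketch only, assuming the checkout from step 1 was made with Git) is to let Git apply the patch instead of the GNU tool. git apply reads the same unified-diff format and strips one leading path component by default, like patch -p1:

```
cd D:\utils\solr_master\lucene-solr

REM Show which files the patch touches, then verify it applies cleanly without changing anything
git apply --stat LUCENE-2899.patch
git apply --check LUCENE-2899.patch

REM Apply it for real
git apply LUCENE-2899.patch
```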

Update 1

I am trying to compile, but the build fails.

```
D:\utils\solr_master\lucene-solr>ant compile
Buildfile: D:\utils\solr_master\lucene-solr\build.xml

BUILD FAILED
D:\utils\solr_master\lucene-solr\build.xml:21: The following error occurred while executing this line:
D:\utils\solr_master\lucene-solr\lucene\common-build.xml:623: java.lang.NullPointerException
        at java.util.Arrays.stream(Arrays.java:5004)
        at java.util.stream.Stream.of(Stream.java:1000)
        at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
        at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
        at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
        at org.apache.tools.ant.util.ChainedMapper.lambda$mapFileName$1(ChainedMapper.java:36)
        at java.util.stream.ReduceOps$1ReducingSink.accept(ReduceOps.java:80)
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:484)
        at org.apache.tools.ant.util.ChainedMapper.mapFileName(ChainedMapper.java:35)
        at org.apache.tools.ant.util.CompositeMapper.lambda$mapFileName$0(CompositeMapper.java:32)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
        at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
        at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
        at org.apache.tools.ant.util.CompositeMapper.mapFileName(CompositeMapper.java:33)
        at org.apache.tools.ant.taskdefs.PathConvert.execute(PathConvert.java:363)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
        at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:346)
        at org.apache.tools.ant.Target.execute(Target.java:448)
        at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:172)
        at org.apache.tools.ant.taskdefs.ImportTask.importResource(ImportTask.java:221)
        at org.apache.tools.ant.taskdefs.ImportTask.execute(ImportTask.java:165)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:346)
        at org.apache.tools.ant.Target.execute(Target.java:448)
        at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:183)
        at org.apache.tools.ant.ProjectHelper.configureProject(ProjectHelper.java:93)
        at org.apache.tools.ant.Main.runBuild(Main.java:824)
        at org.apache.tools.ant.Main.startAnt(Main.java:228)
        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:283)
        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:101)

Total time: 0 seconds
```
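
Note that every frame in this stack trace is inside Ant itself (org.apache.tools.ant.*) and the JDK stream classes; the build dies while Ant is still parsing lucene/common-build.xml, before any Lucene/Solr code runs. A reasonable first sanity check (not a confirmed cause, just a starting point) is to verify which Ant and JDK are actually being picked up from the PATH:

```
REM Print the versions the build is actually using
ant -version
java -version

REM Show which executables on the PATH are resolved first
where ant
where java
```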

Update 2

I have downloaded builds from https://builds.apache.org/job/Solr-Artifacts-7.3/lastSuccessfulBuild/artifact/solr/package/ and https://builds.apache.org/job/Solr-Artifacts-master/lastSuccessfulBuild/artifact/solr/package/, but neither in the 7.3 version nor in the 8.0 (master) version do I see an opennlp directory in the contrib directory. Where can I find it?

Update 3

I have run the version from the master branch, which I downloaded here: https://builds.apache.org/job/Solr-Artifacts-master/lastSuccessfulBuild/artifact/solr/package/, and I have been trying to run OpenNLP like the gentleman in this post:

Exception while integrating openNLP with Solr

But I get the same error as he does.

```
numberplate_shard1_replica_n1:
org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Could not load conf for core numberplate_shard1_replica_n1: Can't load schema managed-schema: Plugin init failure for [schema.xml] fieldType "text_opennlp_nvf": Plugin init failure for [schema.xml] analyzer/tokenizer: Error instantiating class: 'org.apache.lucene.analysis.opennlp.OpenNLPTokenizerFactory'
```

If patch LUCENE-2899 is merged into master, why do I get this error?

Update 5

I restarted Solr and the errors were gone. But...

I was trying to add fields (to managed-schema) following the example (https://wiki.apache.org/solr/OpenNLP):

<fieldType name="text_opennlp" class="solr.TextField">
      <analyzer>
        <tokenizer class="solr.OpenNLPTokenizerFactory"
          sentenceModel="opennlp/en-sent.bin"
          tokenizerModel="opennlp/en-token.bin"
        />
      </analyzer>
    </fieldType>

    <field name="content" type="text_opennlp" indexed="true" termOffsets="true" stored="true" termPayloads="true" termPositions="true" docValues="false" termVectors="true" multiValued="true" required="true"/>

But when I try to run Solr in Cloud mode I get this:

```
D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>solr -e cloud

Welcome to the SolrCloud example!

This interactive session will help you launch a SolrCloud cluster on your local workstation.
To begin, how many Solr nodes would you like to run in your local cluster? (specify 1-4 nodes) [2]:
1
Ok, let's start up 1 Solr nodes for your example SolrCloud cluster.
Please enter the port for node1 [8983]:

Solr home directory D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\solr already exists.

Starting up Solr on port 8983 using command:
"D:utilssolr-7.3.0-7solr-7.3.0-7insolr.cmd" start -cloud -p 8983 -s "D:utilssolr-7.3.0-7solr-7.3.0-7examplecloud
ode1solr"

Waiting up to 30 to see Solr running on port 8983
Started Solr server on port 8983. Happy searching!
INFO  - 2018-03-26 14:42:26.961; org.apache.solr.client.solrj.impl.ZkClientClusterStateProvider; Cluster at localhost:9983 ready

Now let's create a new collection for indexing documents in your 1-node cluster.
Please provide a name for your new collection: [gettingstarted]
numberplate

Collection 'numberplate' already exists!
Do you want to re-use the existing collection or create a new one? Enter 1 to reuse, 2 to create new [1]:
1

Enabling auto soft-commits with maxTime 3 secs using the Config API

POSTing request to Config API: http://localhost:8983/solr/numberplate/config
{"set-property":{"updateHandler.autoSoftCommit.maxTime":"3000"}}

ERROR: Error from server at http://localhost:8983/solr: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/numberplate/config. Reason:
<pre>    Not Found</pre></p>
</body>
</html>




SolrCloud example running, please visit: http://localhost:8983/solr


D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>
```
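
The 404 on /solr/numberplate/config usually means the cores of that collection failed to load (for example because of the schema change just made), so the per-collection Config API endpoint is simply not registered. One way to check (a sketch; the collection name is the one from this session, and curl can be replaced by pasting the URL into a browser) is to ask the Collections API for the cluster status and look at the replica states:

```
REM Replicas of a healthy collection report "state":"active"
curl "http://localhost:8983/solr/admin/collections?action=CLUSTERSTATUS&collection=numberplate"
```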

Update 6

I have created a new collection and I get a more precise error:

```
test_collection_shard1_replica_n1:
org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Could not load conf for core test_collection_shard1_replica_n1: Can't load schema managed-schema: org.apache.solr.core.SolrResourceNotFoundException: Can't find resource 'opennlp/en-sent.bin' in classpath or '/configs/_default', cwd=D:\utils\solr-7.3.0-7\solr-7.3.0-7\server Please check your logs for more information
```

Maybe I need to copy the OpenNLP models from http://opennlp.sourceforge.net/models-1.5/ somewhere.

But where can I put these models?

Can you help me? What am I doing wrong?

Answer

As you can see on LUCENE-2899, the patch has already been applied to 8.0 (master), as well as to 7.3.

You can find these builds at Solr-Artifacts-master for (currently) 8.0 and at Solr-Artifacts-7.3 for 7.3.

The opennlp libraries are bundled inside the artifacts:

```
solr-8.0.0-3304 find . -name '*nlp*'
[...]
./contrib/langid/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lib/opennlp-maxent-3.0.3.jar
./contrib/analysis-extras/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lucene-libs/lucene-analyzers-opennlp-8.0.0-3304.jar
```

You then have to tell Solr to load these jars, which you can do through solrconfig.xml:

<lib dir="../../../contrib/analysis-extras/lib/" regex="opennlp-.*.jar" />
<lib dir="../../../contrib/analysis-extras/lucene-libs/lucene-analyzers-opennlp-.*.jar" regex=".*.jar" />

Confirm in Solr's log file that the jars are loaded as you expect.
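
Regarding Update 6 (Can't find resource 'opennlp/en-sent.bin' in classpath or '/configs/_default'): in cloud mode the model files have to live inside the collection's configset in ZooKeeper, not just on the local disk. Below is a minimal sketch, assuming the collection really uses the _default configset and the embedded ZooKeeper on localhost:9983 shown in the session above; the local directory layout is an assumption as well:

```
REM 1. Download en-sent.bin and en-token.bin from http://opennlp.sourceforge.net/models-1.5/
REM    into server\solr\configsets\_default\conf\opennlp\ (assumed layout)
REM 2. Re-upload the configset to ZooKeeper under the same name
cd D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin
solr zk upconfig -n _default -d ..\server\solr\configsets\_default -z localhost:9983

REM 3. Reload the affected collection so its cores pick up the updated configset
curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=test_collection"
```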

