How to use OpenNLP with Java?


Question

I want to POS-tag an English sentence and do some processing. I would like to use OpenNLP, and I have it installed.

When I execute the command

I:\Workshop\Programming\nlp\opennlp-tools-1.5.0-bin\opennlp-tools-1.5.0>java -jar opennlp-tools-1.5.0.jar POSTagger models\en-pos-maxent.bin < Text.txt

it gives output, POS-tagging the input in Text.txt:

    Loading POS Tagger model ... done (4.009s)
My_PRP$ name_NN is_VBZ Shabab_NNP i_FW am_VBP 22_CD years_NNS old._.


Average: 66.7 sent/s
Total: 1 sent
Runtime: 0.015s

I hope that means it is installed correctly?

Now how do I do this POS tagging from inside a Java application? I have added the openNLP tools, jwnl, and maxent jars to the project, but how do I invoke the POS tagging?

Answer

Here's some (old) sample code I threw together, with modernized code to follow:

package opennlp;

import opennlp.tools.cmdline.PerformanceMonitor;
import opennlp.tools.cmdline.postag.POSModelLoader;
import opennlp.tools.postag.POSModel;
import opennlp.tools.postag.POSSample;
import opennlp.tools.postag.POSTaggerME;
import opennlp.tools.tokenize.WhitespaceTokenizer;
import opennlp.tools.util.ObjectStream;
import opennlp.tools.util.PlainTextByLineStream;

import java.io.File;
import java.io.IOException;
import java.io.StringReader;

public class OpenNlpTest {
    public static void main(String[] args) throws IOException {
        // Load the pre-trained maxent POS model from the working directory.
        POSModel model = new POSModelLoader().load(new File("en-pos-maxent.bin"));
        PerformanceMonitor perfMon = new PerformanceMonitor(System.err, "sent");
        POSTaggerME tagger = new POSTaggerME(model);

        String input = "Can anyone help me dig through OpenNLP's horrible documentation?";
        ObjectStream<String> lineStream =
                new PlainTextByLineStream(new StringReader(input));

        perfMon.start();
        String line;
        while ((line = lineStream.read()) != null) {
            // Split the line on whitespace, then tag each token.
            String[] whitespaceTokenizerLine = WhitespaceTokenizer.INSTANCE.tokenize(line);
            String[] tags = tagger.tag(whitespaceTokenizerLine);

            // POSSample pairs the tokens with their tags; toString() renders "token_TAG ...".
            POSSample sample = new POSSample(whitespaceTokenizerLine, tags);
            System.out.println(sample.toString());

            perfMon.incrementCounter();
        }
        perfMon.stopAndPrintFinalResult();
    }
}

The output is:

Loading POS Tagger model ... done (2.045s)
Can_MD anyone_NN help_VB me_PRP dig_VB through_IN OpenNLP's_NNP horrible_JJ documentation?_NN

Average: 76.9 sent/s 
Total: 1 sent
Runtime: 0.013s

This is basically working from the POSTaggerTool class included as part of OpenNLP. The sample.getTags() is a String array that has the tag types themselves.
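
If you want the tokens and the tags as separate arrays rather than the combined token_TAG output, POSSample exposes both. A minimal sketch (the variable names here are only illustrative):

String[] tokens = sample.getSentence();  // the whitespace-separated tokens
String[] tagArray = sample.getTags();    // one tag per token, in the same order
for (int i = 0; i < tokens.length; i++) {
    System.out.println(tokens[i] + "/" + tagArray[i]);
}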

This requires direct file access to the training data, which is really, really lame.
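
One way around that is to bundle the model on the classpath and load it through POSModel's InputStream constructor instead of going through POSModelLoader. A minimal sketch, assuming en-pos-maxent.bin has been copied into src/main/resources (and that java.io.InputStream is imported):

try (InputStream modelIn = OpenNlpTest.class.getResourceAsStream("/en-pos-maxent.bin")) {
    // getResourceAsStream returns null when the resource is missing, so fail early.
    if (modelIn == null) {
        throw new IOException("en-pos-maxent.bin not found on the classpath");
    }
    POSModel model = new POSModel(modelIn);
    POSTaggerME tagger = new POSTaggerME(model);
    // ... tokenize and tag exactly as above
}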

An updated codebase for this is a little different (and probably more useful).

First, a Maven POM:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.javachannel</groupId>
    <artifactId>opennlp-example</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.opennlp</groupId>
            <artifactId>opennlp-tools</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.testng</groupId>
            <artifactId>testng</artifactId>
            <version>[6.8.21,)</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

And here's the code, written as a test, therefore located in ./src/test/java/org/javachannel/opennlp/example:

package org.javachannel.opennlp.example;

import opennlp.tools.cmdline.PerformanceMonitor;
import opennlp.tools.postag.POSModel;
import opennlp.tools.postag.POSSample;
import opennlp.tools.postag.POSTaggerME;
import opennlp.tools.tokenize.WhitespaceTokenizer;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.URL;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.util.stream.Stream;

public class POSTest {
    private void download(String url, File destination) throws IOException {
        URL website = new URL(url);
        // Stream the remote model straight to disk, closing the channel and stream when done.
        try (ReadableByteChannel rbc = Channels.newChannel(website.openStream());
             FileOutputStream fos = new FileOutputStream(destination)) {
            fos.getChannel().transferFrom(rbc, 0, Long.MAX_VALUE);
        }
    }

    @DataProvider
    Object[][] getCorpusData() {
        // Each row hands the test one parameter: an Object[] of sentences to tag.
        return new Object[][][]{{{
                "Can anyone help me dig through OpenNLP's horrible documentation?"
        }}};
    }

    @Test(dataProvider = "getCorpusData")
    public void showPOS(Object[] input) throws IOException {
        // Fetch the pre-trained model on the first run, then reuse the local copy.
        File modelFile = new File("en-pos-maxent.bin");
        if (!modelFile.exists()) {
            System.out.println("Downloading model.");
            download("http://opennlp.sourceforge.net/models-1.5/en-pos-maxent.bin", modelFile);
        }
        POSModel model = new POSModel(modelFile);
        PerformanceMonitor perfMon = new PerformanceMonitor(System.err, "sent");
        POSTaggerME tagger = new POSTaggerME(model);

        perfMon.start();
        Stream.of(input).map(line -> {
            // Tokenize on whitespace, tag, and render each sentence as "token_TAG ...".
            String[] whitespaceTokenizerLine = WhitespaceTokenizer.INSTANCE.tokenize(line.toString());
            String[] tags = tagger.tag(whitespaceTokenizerLine);

            POSSample sample = new POSSample(whitespaceTokenizerLine, tags);

            perfMon.incrementCounter();
            return sample.toString();
        }).forEach(System.out::println);
        perfMon.stopAndPrintFinalResult();
    }
}

This code doesn't actually test anything - it's a smoke test, if anything - but it should serve as a starting point. Another (potentially) nice thing is that it downloads a model for you if you don't have it downloaded already.
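
With the POM above in place, the smoke test (and the model download on the first run) can be kicked off from the project root with a plain Maven invocation:

mvn test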

