Resource u'tokenizers/punkt/english.pickle' not found


Problem description

My code:

import nltk.data
tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')

Error message:

[ec2-user@ip-172-31-31-31 sentiment]$ python mapper_local_v1.0.py
Traceback (most recent call last):
  File "mapper_local_v1.0.py", line 16, in <module>
    tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')
  File "/usr/lib/python2.6/site-packages/nltk/data.py", line 774, in load
    opened_resource = _open(resource_url)
  File "/usr/lib/python2.6/site-packages/nltk/data.py", line 888, in _open
    return find(path_, path + ['']).open()
  File "/usr/lib/python2.6/site-packages/nltk/data.py", line 618, in find
    raise LookupError(resource_not_found)
LookupError:
  Resource u'tokenizers/punkt/english.pickle' not found.  Please
  use the NLTK Downloader to obtain the resource:

      >>> nltk.download()

  Searched in:
    - '/home/ec2-user/nltk_data'
    - '/usr/share/nltk_data'
    - '/usr/local/share/nltk_data'
    - '/usr/lib/nltk_data'
    - '/usr/local/lib/nltk_data'
    - u''

I'm trying to run this program on a Unix machine.

As per the error message, I logged into the Python shell on my Unix machine and ran the following commands:

import nltk
nltk.download()

I then downloaded everything that was available using the d (download) and l (list) options, but the problem persists.

I did my best to find a solution on the Internet, but everything I found suggested the same steps I had already taken above.
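One quick sanity check, offered here only as a sketch and not part of the original question: NLTK keeps its search locations in the plain Python list nltk.data.path, so printing it (and, if necessary, appending the directory the downloader actually wrote to) confirms whether the downloaded punkt data is visible to nltk.data.load. The custom path below is a hypothetical example, not a value from the original post.

import nltk
import nltk.data

# Directories NLTK searches for resources such as tokenizers/punkt/english.pickle
print(nltk.data.path)

# If the data was saved somewhere non-standard (hypothetical path), add it first:
nltk.data.path.append('/some/custom/nltk_data')

tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')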

Recommended answer

I found the solution:

import nltk
nltk.download()

Once the NLTK Downloader starts:

d) Download  l) List  u) Update  c) Config  h) Help  q) Quit

Downloader> d

Download which package (l=list; x=cancel)? Identifier> punkt
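If the interactive menu is inconvenient (for example on a headless EC2 instance), the same model can be fetched non-interactively. This is a minimal sketch of that alternative; the sentence passed to tokenize() is only an illustration.

import nltk
import nltk.data

# Download only the Punkt sentence tokenizer models instead of everything
nltk.download('punkt')

# The original load should now succeed
tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')
print(tokenizer.tokenize("Hello world. This is a test."))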
