Cannot import either Scrapy's settings module or its scrapy.cfg

Problem description

This is a quite lengthy post, but after doing extensive research I couldn't find a solution. I have a mixed Django 1.4.1 / Scrapy 0.14.4 project on OSX 10.8 and I control Scrapy with the Django project's manage.py command as described here. For instance, calling

python manage.py scrapy crawl example_spider 
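
(For context, this kind of manage.py integration is usually implemented as a small custom Django management command that forwards its arguments straight to Scrapy's command line. The sketch below is only an assumption of what such a command looks like, with a placeholder location such as yourapp/management/commands/scrapy.py; it is not the code from the linked article.)

from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Forward manage.py arguments to Scrapy"

    def run_from_argv(self, argv):
        # Bypass Django's option parsing so Scrapy sees its sub-command
        # ("crawl", "server", ...) and arguments untouched.
        self._argv = argv
        return self.execute()

    def handle(self, *args, **options):
        from scrapy.cmdline import execute
        # argv looks like ['manage.py', 'scrapy', 'crawl', 'example_spider'];
        # drop 'manage.py' and hand the rest to Scrapy.
        execute(self._argv[1:])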

This works without a problem. Now I'm at the point where I want to set up the scrapyd web service to deploy my spiders. However, when I execute

python manage.py scrapy server

I get this exception:

scrapy.exceptions.NotConfigured: Unable to find scrapy.cfg file to infer project data dir

So, apparently Scrapy cannot find the scrapy.cfg file because I don't execute it from within the Scrapy project. The other Scrapy commands work, however, because in my Django project's settings.py I did the following:

import os, sys

sys.path.append('/absolute/path/to/my/Scrapy/project')
os.environ['SCRAPY_SETTINGS_MODULE'] = 'my_scrapy_project_name.settings'

Question 1: Why can't Scrapy detect the scrapy.cfg file in my setup? How can I resolve this?
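
(For reference, Scrapy only discovers scrapy.cfg by walking upward from the current working directory. The snippet below is a rough paraphrase of that lookup, in the spirit of scrapy.utils.conf.closest_scrapy_cfg in Scrapy 0.14 rather than a verbatim copy; it illustrates why a cfg sitting in a sibling Scrapy project directory is never found when the command is run from the Django project.)

import os


def closest_scrapy_cfg(path='.'):
    # Walk upward from `path` and return the first scrapy.cfg found, or ''.
    path = os.path.abspath(path)
    cfgfile = os.path.join(path, 'scrapy.cfg')
    if os.path.exists(cfgfile):
        return cfgfile
    parent = os.path.dirname(path)
    if parent == path:  # reached the filesystem root without finding it
        return ''
    return closest_scrapy_cfg(parent)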

Since the stuff mentioned above doesn't work, I tried to get the scrapyd server running using just the scrapy command from within my Scrapy project directory. Executing scrapy server from the top-level directory of my Scrapy project yields the following:

$ scrapy server
UserWarning: Cannot import scrapy settings module my_scrapy_project_name.settings
warnings.warn("Cannot import scrapy settings module %s" % scrapy_module)
2012-08-31 21:58:31+0200 [-] Log opened.
2012-08-31 21:58:32+0200 [-] Scrapyd web console available at http://localhost:6800/
2012-08-31 21:58:32+0200 [Launcher] Scrapyd started: max_proc=8, runner='scrapyd.runner'
2012-08-31 21:58:32+0200 [-] Site starting on 6800
2012-08-31 21:58:32+0200 [-] Starting factory <twisted.web.server.Site instance at 0x101dd3d88> 

The server runs without a problem; however, the settings.py file of my Scrapy project cannot be found because the respective environment variable is no longer set. That's why I run the following in my terminal:

export PYTHONPATH=/absolute/path/to/my/Scrapy/project
export SCRAPY_SETTINGS_MODULE=my_scrapy_project_name.settings

Unfortunately, these two commands have no effect. Whenever I execute scrapy server (or any other Scrapy command), I get the message that Scrapy cannot import its project's settings module.
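
(A quick sanity check I would add here, as an assumption rather than something from the original post: in the very same shell where the two variables were exported, confirm that the settings module is importable at all. If the import fails, PYTHONPATH most likely points one level too deep or too shallow, or the package lacks an __init__.py.)

import importlib
import os

# Run inside the same shell session where the exports were made; the
# variables do not apply to other, already-open terminals.
print(os.environ.get('PYTHONPATH'))
print(os.environ.get('SCRAPY_SETTINGS_MODULE'))

settings = importlib.import_module('my_scrapy_project_name.settings')
print(settings.__file__)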

My scrapy.cfg only has the following content at the moment:

[settings]
default = my_scrapy_project_name.settings

[deploy:scrapyd]
url = http://localhost:6800/
project = my_scrapy_project_name

When I try to deploy my Scrapy project to the scrapyd server, it seems to work at first, but then I realize that no spiders have been uploaded, probably because the settings file could not be detected. Here is the console output:

$ scrapy deploy scrapyd -p my_scrapy_project_name
/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scrapy/utils/project.py:17: UserWarning: Cannot import scrapy settings module my_scrapy_project_name.settings
  warnings.warn("Cannot import scrapy settings module %s" % scrapy_module)
Building egg of event_crawler-1346531706
'build/lib' does not exist -- can't clean it
'build/bdist.macosx-10.6-intel' does not exist -- can't clean it
'build/scripts-2.7' does not exist -- can't clean it
zip_safe flag not set; analyzing archive contents...
Deploying event_crawler-1346531706 to http://localhost:6800/addversion.json
Server response (200):
{"status": "ok", "project": "my_scrapy_project_name", "version": "1346531706", "spiders": 0}

Question 2: How do I correctly export the path and the environment variable above so that this warning disappears?

Question 3: Since the scrapyd server itself seems to work fine, how can I upload my spiders correctly?
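
(As a side note on verifying what actually reached the server, and purely as an assumption on my part: scrapyd exposes a small JSON API, so the registered projects and spiders can be listed directly, e.g. with Python 2.7 to match the environment above.)

import json
import urllib2

base = 'http://localhost:6800'
# listprojects.json and listspiders.json are standard scrapyd endpoints.
print(json.load(urllib2.urlopen(base + '/listprojects.json')))
print(json.load(urllib2.urlopen(
    base + '/listspiders.json?project=my_scrapy_project_name')))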

Thanks a lot!

Recommended answer

wiki:

The first one and the second one both address the problems with conflicting Django and Scrapy settings.

Hope this helps...

This question on SO also addresses a lot of the settings problems between Django and Scrapy.
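
One common pattern for resolving such Django/Scrapy settings conflicts, sketched here purely as an assumption (it may or may not match what the linked answers recommend; every path and name below is a placeholder), is to wire Django into the Scrapy project's own settings.py rather than the other way around:

import os
import sys

# At the top of the Scrapy project's settings.py: make the Django project
# importable and point Django at its settings, so Scrapy-side code (items,
# pipelines) can use the Django ORM without the two settings modules clashing.
sys.path.append('/absolute/path/to/my/django/project')
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_django_project_name.settings')

BOT_NAME = 'my_scrapy_project_name'
SPIDER_MODULES = ['my_scrapy_project_name.spiders']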
