Scrapy deploy stopped working


Problem description

I am trying to deploy a scrapy project using scrapyd, but it is giving me an error:

sudo scrapy deploy default -p eScraper
Building egg of eScraper-1371463750
'build/scripts-2.7' does not exist -- can't clean it
zip_safe flag not set; analyzing archive contents...
eScraperInterface.settings: module references __file__
eScraper.settings: module references __file__
Deploying eScraper-1371463750 to http://localhost:6800/addversion.json
Server response (200):
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/webservice.py", line 18, in render
    return JsonResource.render(self, txrequest)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/txweb.py", line 10, in render
    r = resource.Resource.render(self, txrequest)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web/resource.py", line 250, in render
    return m(request)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/webservice.py", line 66, in render_POST
    spiders = get_spider_list(project)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/utils.py", line 65, in get_spider_list
    raise RuntimeError(msg.splitlines()[-1])
RuntimeError: OSError: [Errno 20] Not a directory: '/tmp/eScraper-1371463750-Lm8HLh.egg/images'

Earlier I was able to deploy the project properly, but not anymore. However, if I run the spider directly with scrapy crawl spiderName, there is no problem. Can someone help me, please?

Recommended answer

Try these two things:

1. You may have deployed too many versions; try deleting some older versions.
2. Before deploying, delete the build folder and the setup file.
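The two steps above can be sketched as shell commands. The paths are assumptions (run from your own project root), and the first two lines only simulate stale artifacts so the cleanup is demonstrable; the delversion.json call is scrapyd's endpoint for pruning an old version and is shown commented out because it needs a running scrapyd:

```shell
# Simulate stale artifacts left behind by a previous deploy (demo only)
mkdir -p build/scripts-2.7
touch setup.py

# Step 2 from the answer: remove the build folder and the setup file
rm -rf build setup.py *.egg-info

# Step 1: prune an old version held by scrapyd (requires scrapyd running;
# project and version values here are taken from the question)
# curl http://localhost:6800/delversion.json -d project=eScraper -d version=1371463750

# Then redeploy:
# scrapy deploy default -p eScraper
```

Deleting `build/` also clears the stale `build/scripts-2.7` path that the egg builder complained about ("does not exist -- can't clean it").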

As far as running the crawler is concerned: if you schedule a crawler with an arbitrary name that you have not even deployed, scrapyd will still return an 'OK' response along with a job id.
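Because schedule.json answers 'ok' even for unknown spider names, it can help to check listspiders.json first. A minimal sketch (Python 3 for brevity, though the question's environment is Python 2.7; the scrapyd URL and project name are from the question, the spider name `mySpider` is hypothetical):

```python
import json
import urllib.request

SCRAPYD = "http://localhost:6800"  # default scrapyd address


def spider_is_deployed(spider, listed_spiders):
    # Pure check against the 'spiders' list returned by listspiders.json
    return spider in listed_spiders


def fetch_spiders(project, base=SCRAPYD):
    # Query scrapyd for the spiders in a deployed project
    url = f"{base}/listspiders.json?project={project}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp).get("spiders", [])


if __name__ == "__main__":
    try:
        spiders = fetch_spiders("eScraper")
    except OSError:
        spiders = []  # scrapyd not running; this is only a sketch
    print(spider_is_deployed("mySpider", spiders))
```

Only schedule the job once the name actually appears in the list; otherwise the 'OK' response is meaningless.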

