Calling Scrapy Spider from Django


Question

I have a project with a Django and a Scrapy folder in the same workspace:

my_project/
    django_project/
        django_project/
            settings.py
        app1/
        app2/
        manage.py
        ...
    scrapy_project/
        scrapy_project/
            settings.py
        scrapy.cfg
        ...

I've already connected Scrapy with my Django app1 model, so every time I run my spider it stores the collected data in my PostgreSQL database. This is how my Scrapy project can access the Django model:

# in my_project/scrapy_project/scrapy_project/settings.py
import sys
import os
import django

# Make the sibling django_project importable; the path is built relative
# to this settings file so it does not depend on the working directory.
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..', '..', 'django_project'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_project.settings'
django.setup()

Everything works great when I call the spider from the command line, but when I want to call the spider as a script from a Django view or a Celery task in Django, for example:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
process = CrawlerProcess(get_project_settings())
process.crawl('spider_name')
process.start()

I get an error:

KeyError: 'Spider not found: spider_name'
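A plausible cause, as a hedged sketch: `get_project_settings()` locates the Scrapy settings via the `SCRAPY_SETTINGS_MODULE` environment variable, or a `scrapy.cfg` found from the current working directory. When the script is started from the Django side, neither is available, so the loaded settings carry no `SPIDER_MODULES` and the lookup by spider name fails. Setting the variable and the import path explicitly before building the process may help; the paths and module name below are assumptions based on the layout above:

```python
import os
import sys

# Make the scrapy project importable and tell Scrapy which settings module
# to use. Both values are assumptions based on the folder layout above;
# adjust them to the real project.
sys.path.append(os.path.abspath(os.path.join('..', 'scrapy_project')))
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'scrapy_project.settings')

def run_spider(name: str) -> None:
    # Imported here so this module still loads where Scrapy isn't installed.
    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings

    process = CrawlerProcess(get_project_settings())
    process.crawl(name)  # the name is now resolved via SPIDER_MODULES
    process.start()      # blocks until the crawl finishes
```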

I think I'm supposed to tell Django where Scrapy is located (as I've done in the Scrapy settings), but I don't know how. To be honest, I'm not even sure that the folder structure I designed for this project is the right choice.

Answer

The following example is from the Scrapy docs:

from my_project.scrapy_project.spiders import MySpider
...
process.crawl(MySpider)
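A fuller sketch of that approach, under the assumption that the spider lives in a module like `scrapy_project/spiders/my_spider.py` and that the Scrapy project folder has been made importable; the module and class names are illustrative, not from the original post:

```python
import os
import sys

# Make the scrapy project importable and point Scrapy at its settings.
# Both paths are assumptions based on the folder layout in the question.
sys.path.append(os.path.abspath(os.path.join('..', 'scrapy_project')))
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'scrapy_project.settings')

def crawl() -> None:
    # Imported here so this module still loads where Scrapy isn't installed.
    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings
    # Hypothetical module path; importing the class directly sidesteps
    # the name lookup that raised the KeyError.
    from scrapy_project.spiders.my_spider import MySpider

    process = CrawlerProcess(get_project_settings())
    process.crawl(MySpider)  # pass the spider class instead of a name string
    process.start()          # blocks until the crawl finishes
```

Note that `process.start()` runs the Twisted reactor, which cannot be restarted within the same process, so calling this twice from a long-running Django process will raise `ReactorNotRestartable`; for repeated crawls, running each one in a separate process (e.g. from a Celery task) avoids that limitation.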
