scrapy from script output in json


Question

I am running Scrapy in a Python script:

from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy import signals
from scrapy.utils.project import get_project_settings
from scrapy.xlib.pydispatch import dispatcher

# stop_reactor and ArgosSpider are defined elsewhere in the script
def setup_crawler(domain):
    dispatcher.connect(stop_reactor, signal=signals.spider_closed)
    spider = ArgosSpider(domain=domain)
    settings = get_project_settings()
    crawler = Crawler(settings)
    crawler.configure()
    crawler.crawl(spider)
    crawler.start()
    reactor.run()

It runs successfully and stops, but where is the result? I want the result in JSON format, something like:

result = responseInJSON

just as if I had run the command:

scrapy crawl argos -o result.json -t json

Answer

You need to set the FEED_FORMAT and FEED_URI settings manually:

settings.overrides['FEED_FORMAT'] = 'json'
settings.overrides['FEED_URI'] = 'result.json'
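For reference, the feed exporter configured above just serializes the scraped items into one JSON array in `result.json`. A Scrapy-free stand-in for that output shape (plain `json` module, with hypothetical item dicts in place of real scraped items) looks like:

```python
import json

# Hypothetical items, standing in for what the spider would scrape
items = [{"name": "widget", "price": 9.99},
         {"name": "gadget", "price": 4.5}]

# Same shape as what `scrapy crawl ... -o result.json -t json` writes:
# a single JSON array of item objects
with open("result.json", "w") as f:
    json.dump(items, f, indent=2)

# Reading it back gives the items as a list of dicts
with open("result.json") as f:
    loaded = json.load(f)

print(loaded == items)  # -> True
```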

If you want to get the results into a variable, you can define a Pipeline class that collects the items into a list, and use a spider_closed signal handler to inspect the results:

import json

from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy import log, signals
from scrapy.utils.project import get_project_settings


results = []


class MyPipeline(object):
    def process_item(self, item, spider):
        results.append(dict(item))
        return item  # pass the item on to any later pipelines


def spider_closed(spider):
    print(results)


# set up spider
spider = TestSpider(domain='mydomain.org')

# set up settings
settings = get_project_settings()
settings.overrides['ITEM_PIPELINES'] = {'__main__.MyPipeline': 1}

# set up crawler
crawler = Crawler(settings)
crawler.signals.connect(spider_closed, signal=signals.spider_closed)
crawler.configure()
crawler.crawl(spider)

# start crawling
crawler.start()
log.start()
reactor.run()
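The signal wiring in the script above (`crawler.signals.connect(spider_closed, signal=signals.spider_closed)`) is what makes the handler fire once the crawl finishes. The dispatch pattern itself does not depend on Scrapy; a minimal sketch of it in plain Python (the `SignalManager` class and the `"argos"` spider name here are hypothetical, modeled loosely on Scrapy's signals API) is:

```python
# Minimal signal-dispatch sketch mirroring
# crawler.signals.connect(handler, signal=...) followed by a send.
class SignalManager:
    def __init__(self):
        self._handlers = {}

    def connect(self, receiver, signal):
        # Register a handler for a given signal object
        self._handlers.setdefault(signal, []).append(receiver)

    def send(self, signal, **kwargs):
        # Call every handler registered for this signal
        for receiver in self._handlers.get(signal, []):
            receiver(**kwargs)


SPIDER_CLOSED = object()  # stands in for scrapy.signals.spider_closed

results = [{"name": "widget"}]  # items the pipeline would have collected
seen = []

def spider_closed(spider):
    # Snapshot the collected results when the "spider" closes
    seen.append((spider, list(results)))

signals = SignalManager()
signals.connect(spider_closed, signal=SPIDER_CLOSED)
signals.send(SPIDER_CLOSED, spider="argos")

print(seen)  # -> [('argos', [{'name': 'widget'}])]
```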

FYI, take a look at how Scrapy parses command-line arguments.

See also: Capture standard output within the same process in Python.
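The stdout-capture idea from that link can be sketched with the standard library alone (`contextlib.redirect_stdout`; no Scrapy involved):

```python
import io
from contextlib import redirect_stdout

buf = io.StringIO()
with redirect_stdout(buf):
    # anything printed inside this block is captured in buf
    # instead of reaching the terminal
    print("scraped: widget")

captured = buf.getvalue()
print(repr(captured))
```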
