Passing arguments to process.crawl in Scrapy python


Question

I would like to get the same result as this command line: scrapy crawl linkedin_anonymous -a first=James -a last=Bond -o output.json

My script is the following:

import scrapy
from linkedin_anonymous_spider import LinkedInAnonymousSpider
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

spider = LinkedInAnonymousSpider(None, "James", "Bond")
process = CrawlerProcess(get_project_settings())
process.crawl(spider) ## <-------------- (1)
process.start()

I found out that process.crawl() in (1) is creating another LinkedInAnonymousSpider, where first and last are None (printed in (2)). If so, then there is no point in creating the spider object, and how is it possible to pass the arguments first and last to process.crawl()?

linkedin_anonymous:

from logging import INFO

import scrapy

class LinkedInAnonymousSpider(scrapy.Spider):
    name = "linkedin_anonymous"
    allowed_domains = ["linkedin.com"]
    start_urls = []

    base_url = "https://www.linkedin.com/pub/dir/?first=%s&last=%s&search=Search"

    def __init__(self, input=None, first=None, last=None):
        self.input = input  # source file name
        self.first = first
        self.last = last

    def start_requests(self):
        print(self.first) ## <------------- (2)
        if self.first and self.last: # taking input from command line parameters
            url = self.base_url % (self.first, self.last)
            yield self.make_requests_from_url(url)

    def parse(self, response): ...

Answer

Pass the spider arguments on the process.crawl method:

process.crawl(spider, input='inputargument', first='James', last='Bond')
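This works because process.crawl() does not reuse a spider object you constructed yourself; it builds a new spider from the class, forwarding any extra keyword arguments to the spider's constructor, which is why the instance created in the question ends up with first and last set to None. A minimal stand-in sketch of that forwarding behaviour (plain Python, no Scrapy required; the Spider and crawl names here are simplified stand-ins, not Scrapy's actual implementation):

```python
class Spider:
    """Stand-in for a Scrapy spider with keyword-argument attributes."""

    def __init__(self, input=None, first=None, last=None):
        self.input = input
        self.first = first
        self.last = last


def crawl(spider_cls, **kwargs):
    # Like Scrapy's CrawlerProcess.crawl(), this takes the spider *class*
    # and forwards the keyword arguments to the spider constructor,
    # instead of reusing a pre-built instance.
    return spider_cls(**kwargs)


spider = crawl(Spider, input='inputargument', first='James', last='Bond')
print(spider.first, spider.last)  # → James Bond
```

So in the real script, pass LinkedInAnonymousSpider (the class) rather than the pre-built spider instance, and supply first and last as keyword arguments as shown in the answer above.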
