How to pass multiple arguments to a Scrapy spider (getting error "running 'scrapy crawl' with more than one spider is no longer supported")?
Question
I would like to pass multiple user-defined arguments to my Scrapy spider, so I tried to follow this post: How to pass a user defined argument in scrapy spider
However, when I follow the advice there, I get an error:
root@ scrapy crawl dmoz -a address= 40-18 48th st -a borough=4
Usage
=====
scrapy crawl [options] <spider>
crawl: error: running 'scrapy crawl' with more than one spider is no longer supported
I also tried various permutations of quotation marks:
root@ scrapy crawl dmoz -a address= "40-18 48th st" -a borough="4"
Usage
=====
scrapy crawl [options] <spider>
crawl: error: running 'scrapy crawl' with more than one spider is no longer supported
What is the correct way to pass parameters to a Scrapy spider? I would like to pass a username and password for the spider's login/scraping process. Thanks for any suggestions.
Answer
This isn't a Scrapy problem, I think. It's how your shell interprets the input, splitting it into tokens at whitespace. So you must not have any whitespace between a key and its value. Try:
scrapy crawl dmoz -a address="40-18 48th st" -a borough="4"
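As a rough illustration of why this happens, the standard-library `shlex` module splits a command line the same way a POSIX shell does. The sketch below shows how the failing command breaks the address into separate tokens (which `scrapy crawl` then mistakes for extra spider names), and how the quoted form keeps each `-a` value whole. The `DmozSpider` class at the end is a hypothetical stand-in, not Scrapy's actual base class; it only illustrates the pattern that each `-a key=value` pair arrives in the spider's `__init__` as a keyword argument.

```python
import shlex

# The failing invocation: the space after "address=" makes the shell
# split the street address into separate tokens.
bad = shlex.split('scrapy crawl dmoz -a address= 40-18 48th st -a borough=4')
print(bad)
# ['scrapy', 'crawl', 'dmoz', '-a', 'address=', '40-18', '48th', 'st', '-a', 'borough=4']
# "40-18", "48th", and "st" look like additional spider names to Scrapy,
# hence: running 'scrapy crawl' with more than one spider is no longer supported

# The working invocation: no space after "=", and the quotes keep the
# multi-word value together as one token.
good = shlex.split('scrapy crawl dmoz -a address="40-18 48th st" -a borough="4"')
print(good)
# ['scrapy', 'crawl', 'dmoz', '-a', 'address=40-18 48th st', '-a', 'borough=4']

# Hypothetical stand-in (no scrapy import) showing how the spider
# receives the -a arguments as keyword arguments to __init__:
class DmozSpider:
    name = "dmoz"

    def __init__(self, address=None, borough=None, **kwargs):
        self.address = address
        self.borough = borough

spider = DmozSpider(address="40-18 48th st", borough="4")
print(spider.address, spider.borough)
# 40-18 48th st 4
```

The same pattern works for a username and password: `scrapy crawl dmoz -a username="me" -a password="secret"`, with matching `username`/`password` parameters in the spider's `__init__`.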