Scrapy: How to pass list of arguments through command prompt to spider?
Question
Creating a scraper for a fantasy team. Looking for a way to pass a list of player names as arguments, and then run the parsing code for each player_name in player_list.
I currently have something like this:
class statsspider(BaseSpider):
    name = 'statsspider'

    def __init__(self, domain=None, player_list=""):
        self.allowed_domains = ['sports.yahoo.com']
        self.start_urls = [
            'http://sports.yahoo.com/nba/players',
        ]
        self.player_list = "%s" % player_list

    def parse(self, response):
        # example code
        yield request
I'm assuming entering a list of arguments is the same as entering just one argument through the command line, so I enter something like this:
scrapy crawl statsspider -a player_list=['xyz','abc']
Question 2!
Solved the first issue by inputting a comma-delimited list of arguments like so:
scrapy crawl statsspider -a player_list="abc def,ghi jkl"
I now want to go through each "name" (i.e. 'abc def') to find the first initial of their last name (in this case 'd').
Using the code:
array = []
for player_name in self.player_list:
    array.append(player_name)
print array
I end up with the result [["'",'a','b','c',... etc]]. Why does Python not assign player_name to each 'name' (e.g. 'abc def' and 'ghi jkl')? Can someone explain this logic to me, and I will probably understand the right way to do it afterwards!
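The behavior makes sense once you recall that Scrapy passes every `-a` argument to the spider as a plain string, and iterating over a Python string yields one character at a time. A minimal sketch of the difference (using the sample names from the question):

```python
player_list = "abc def,ghi jkl"  # what the spider actually receives from -a

# Iterating over the raw string yields individual characters:
chars = [c for c in player_list]
print(chars[:3])  # ['a', 'b', 'c']

# Splitting on the comma yields the intended names:
names = player_list.split(',')
print(names)  # ['abc def', 'ghi jkl']
```

So the loop over `self.player_list` was looping over characters, not names; splitting first restores the intended list.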
Answer
Shell arguments are string-based. You need to parse the argument in your code.
Command line:
scrapy crawl statsspider -a player_list=xyz,abc
Python code:
self.player_list = player_list.split(',')
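Putting the pieces together, here is a short sketch that splits the argument into names and then pulls out the first initial of each last name. The initial-extraction step is an illustrative assumption (it takes the first letter of the last whitespace-separated token), not something the answer itself specifies:

```python
# The comma-delimited -a value, split into individual names:
player_list = "abc def,ghi jkl".split(',')  # -> ['abc def', 'ghi jkl']

initials = []
for player_name in player_list:
    last_name = player_name.split()[-1]  # last whitespace-separated token, e.g. 'def'
    initials.append(last_name[0])        # first letter of the last name

print(initials)  # ['d', 'j']
```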