Change number of running spiders in scrapyd
Question
Hey, so I have about 50 spiders in my project and I'm currently running them via a scrapyd server. I'm running into an issue where some of the resources I use get locked, which makes my spiders fail or run really slowly. I was hoping there was some way to tell scrapyd to have only 1 running spider at a time and leave the rest in the pending queue. I didn't see a configuration option for this in the docs. Any help would be much appreciated!
Answer
This can be done through the scrapyd settings. Set max_proc to 1:

max_proc

The maximum number of concurrent Scrapy processes that will be started.
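A minimal sketch of the relevant section of a scrapyd.conf file (the section name and option are from the scrapyd docs; the placement of the file is an example, as scrapyd also reads locations such as /etc/scrapyd/scrapyd.conf or a scrapyd.conf next to your project):

```ini
[scrapyd]
# Allow at most one Scrapy process at a time; any other scheduled
# spiders wait in the pending queue until the running one finishes.
max_proc = 1
```

Note that max_proc defaults to 0, which means scrapyd instead starts up to max_proc_per_cpu processes per available CPU; setting max_proc to a nonzero value overrides that calculation with a fixed limit.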