How to save Scrapy crawl command output
Problem description
I am trying to save the output of the scrapy crawl command. I tried:
scrapy crawl someSpider -o some.json -t json >> some.text
But it didn't work... Can somebody tell me how I can save the output to a text file? I mean the logs and information printed by Scrapy.
Recommended answer
You need to redirect stderr too; you are redirecting only stdout. You can redirect it like this:
scrapy crawl someSpider -o some.json -t json 2>some.text
The key is the number 2, which "selects" stderr as the source of the redirection.
If you would like to redirect both stderr and stdout into one file, you can use:
scrapy crawl someSpider -o some.json -t json &>some.text
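Note that `&>` is a Bash extension. In a POSIX shell (e.g. plain sh or dash), the portable spelling redirects stdout to the file first and then duplicates stderr onto it; the order of the two redirections matters. A minimal sketch using a stand-in command that writes to both streams (the same redirection applies verbatim to the scrapy invocation):

```shell
# Stand-in for the scrapy command: one line to stdout, one to stderr.
sh -c 'echo "to stdout"; echo "to stderr" 1>&2' > some.text 2>&1

# For scrapy, the portable form is:
#   scrapy crawl someSpider -o some.json -t json > some.text 2>&1

# Both lines end up in the file.
cat some.text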
For more about output redirection, see: http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-3.html
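As an alternative to shell redirection, Scrapy can write its log straight to a file via the `--logfile` command-line option (or the `LOG_FILE` setting); the `-o` feed export is unaffected. A sketch, assuming a Scrapy version that supports these options:

```shell
# Let Scrapy itself write the log to some.text; items still go to some.json.
scrapy crawl someSpider -o some.json --logfile some.text

# Equivalently, set it once in the project's settings.py:
#   LOG_FILE = "some.text"
```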