Logging in Scrapy

Question:
I am having trouble with logging in scrapy, and most of what I can find is out of date.
I have set LOG_FILE="log.txt" in the settings.py file, and from the documentation, this should work:
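For reference, the relevant lines in settings.py look like this (LOG_LEVEL is shown here only as the related option; I set just LOG_FILE):

# settings.py -- Scrapy project settings
LOG_FILE = "log.txt"   # write all log output to this file
LOG_LEVEL = "INFO"     # optional; Scrapy's default level is DEBUG
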
Scrapy provides a logger within each Spider instance, that can be accessed and used like this:
import scrapy

class MySpider(scrapy.Spider):
    name = 'myspider'
    start_urls = ['http://scrapinghub.com']

    def parse(self, response):
        self.logger.info('Parse function called on %s', response.url)
But when I do:
class MySpider(CrawlSpider):
    # other code

    def parse_page(self, response):
        self.logger.info("foobar")
I get nothing. If I set

logger = logging.basicConfig(filename="log.txt", level=logging.INFO)

at the top of my file, after my imports, it creates a log file, and the default output gets logged just fine, but
class MySpider(CrawlSpider):
    # other code

    def parse_page(self, response):
        logger.info("foobar")
fails to make an appearance. I have also tried putting it in the class __init__, as such:
def __init__(self, *a, **kw):
    super(FanfictionSpider, self).__init__(*a, **kw)
    logging.basicConfig(filename="log.txt", level=logging.INFO)
I once again get no output to the file, just to the console, and foobar does not show up. Can someone please direct me on how to correctly set up logging in Scrapy?
Answer:

For logging I just put this on the spider class:
import logging

import scrapy
from scrapy.utils.log import configure_logging

class SomeSpider(scrapy.Spider):
    configure_logging(install_root_handler=False)
    logging.basicConfig(
        filename='log.txt',
        format='%(levelname)s: %(message)s',
        level=logging.INFO
    )
This will put all Scrapy output into the project root directory as a log.txt file.
If you want to log something manually, you shouldn't use the deprecated scrapy.log module; just use the standard Python logging module:
import logging
logging.error("Some error")
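A minimal, self-contained sketch of that, using only the stdlib logging module (nothing Scrapy-specific assumed) and the same file and format as the spider example above:

import logging

# Configure the root logger once, before any log calls are made;
# later calls to basicConfig are silently ignored.
logging.basicConfig(
    filename='log.txt',
    format='%(levelname)s: %(message)s',
    level=logging.INFO,
)

logging.info("spider started")  # written: INFO meets the configured level
logging.error("Some error")     # written: ERROR exceeds the configured level
logging.debug("noise")          # dropped: DEBUG is below the configured level

Anything you log this way from inside a spider callback ends up in the same log.txt, alongside Scrapy's own output.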