How to suppress Spark logging in unit tests?
Problem description
So, thanks to easily googleable blogs, I tried:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

class SparkEngineSpecs extends Specification {
  sequential

  // Set the given level on each named logger, returning the previous levels
  def setLogLevels(level: Level, loggers: Seq[String]): Map[String, Level] = loggers.map(loggerName => {
    val logger = Logger.getLogger(loggerName)
    val prevLevel = logger.getLevel
    logger.setLevel(level)
    loggerName -> prevLevel
  }).toMap

  setLogLevels(Level.WARN, Seq("spark", "org.eclipse.jetty", "akka"))

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))

  // ... my unit tests
}
But unfortunately it doesn't work; I still get a lot of Spark output, e.g.:
14/12/02 12:01:56 INFO MemoryStore: Block broadcast_4 of size 4184 dropped from memory (free 583461216)
14/12/02 12:01:56 INFO ContextCleaner: Cleaned broadcast 4
14/12/02 12:01:56 INFO ContextCleaner: Cleaned shuffle 4
14/12/02 12:01:56 INFO ShuffleBlockManager: Deleted all files for shuffle 4
Solution
Add the following to a log4j.properties file in the src/test/resources directory; create the file and directory if they don't exist:
# Change this to set Spark log level
log4j.logger.org.apache.spark=WARN
# Silence akka remoting
log4j.logger.Remoting=WARN
# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN
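Note that these lines only set logger levels; log4j still needs a root logger with an appender configured somewhere on the test classpath. If nothing else in your project provides one, the top of the file can carry the standard log4j 1.x console-appender boilerplate (modeled on Spark's own conf/log4j.properties.template; this part is an assumption, not something the original answer required):

# Standard log4j boilerplate: send everything to the console by default
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n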
When I run my unit tests (I'm using JUnit and Maven), I only receive WARN-level logs; in other words, no more clutter from INFO-level logs (though they can be useful at times for debugging).
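As a side note beyond the configuration-file approach: on Spark 1.4 or newer, the context itself exposes a programmatic switch, which would replace the setLogLevels helper from the question. A minimal sketch:

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))
// Available since Spark 1.4; accepts ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE or WARN
sc.setLogLevel("WARN")

The properties-file route still has the advantage of silencing the startup logging that happens before the SparkContext exists.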
I hope this helps.