Logger is not working inside Spark UDF on cluster
Problem description
I have placed log.info statements inside my UDF, but it fails on the cluster; locally it works fine. Here is the snippet:
def relType = udf((colValue: String, relTypeV: String) => {
  var relValue = "NA"
  val relType = relTypeV.split(",").toList
  val relTypeMap = relType.map { col =>
    val split = col.split(":")
    (split(0), split(1))
  }.toMap
  relTypeMap.foreach { x =>
    // Guard against nulls and blanks before comparing; the original used `||`,
    // which lets a null colValue through to equalsIgnoreCase and can NPE.
    if (x._1 != null && colValue != null && x._1.trim() != "" && colValue.trim() != ""
        && colValue.equalsIgnoreCase(x._1)) {
      relValue = relTypeMap.getOrElse(x._1, "NA")
      log.info("testing.........")
    }
  }
  relValue
})
Also, when I call a function inside the UDF and use log statements there, the function itself runs fine on the cluster, but the logs are still not printed.
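One point worth noting: a UDF body executes on the executors, not the driver, so its log output lands in the executor logs (visible in the Spark UI or via the cluster's log tooling), and a logger referenced inside the closure must survive serialization. A common sketch is a serializable holder with a `@transient lazy val` logger, so each executor JVM re-creates the logger on first use instead of serializing the driver's instance. The `UdfLog` name is hypothetical, and `java.util.logging` is used here only to keep the sketch self-contained; the same pattern applies to a log4j `Logger`:

```scala
import java.util.logging.Logger

// Serializable holder: @transient lazy means the closure captures only the
// (stateless) object reference; each executor JVM builds its own Logger on
// first use rather than deserializing one shipped from the driver.
object UdfLog extends Serializable {
  @transient lazy val log: Logger = Logger.getLogger("UdfLog")
}

// Inside the UDF body one would then write:
// UdfLog.log.info("testing.........")
```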
Recommended answer
log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d{yyyy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingAppender.File=src//main//resources//spark.log
log4j.appender.RollingAppender.DatePattern='.'yyyy-MM-dd
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=%d{yyyy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.appender.RollingAppenderU=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingAppenderU.File=src//main//resources//sparkU.log
log4j.appender.RollingAppenderU.DatePattern='.'yyyy-MM-dd
log4j.appender.RollingAppenderU.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppenderU.layout.ConversionPattern=%d{yyyy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# By default, everything goes to console and file
log4j.rootLogger=INFO, RollingAppender, myConsoleAppender
# My custom logging goes to another file
log4j.logger.myLogger=INFO, RollingAppenderU
# The noisier spark logs go to file only
log4j.logger.spark.storage=INFO, RollingAppender
log4j.additivity.spark.storage=false
log4j.logger.spark.scheduler=INFO, RollingAppender
log4j.additivity.spark.scheduler=false
log4j.logger.spark.CacheTracker=INFO, RollingAppender
log4j.additivity.spark.CacheTracker=false
log4j.logger.spark.CacheTrackerActor=INFO, RollingAppender
log4j.additivity.spark.CacheTrackerActor=false
log4j.logger.spark.MapOutputTrackerActor=INFO, RollingAppender
log4j.additivity.spark.MapOutputTrackerActor=false
log4j.logger.spark.MapOutputTracker=INFO, RollingAppender
log4j.additivity.spark.MapOutputTracker=false
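This configuration only takes effect on the executors if they can actually load it; also note that a relative file path like `src//main//resources//spark.log` will generally not exist on cluster nodes, so a cluster-side path would be needed there. A hedged sketch of shipping a custom `log4j.properties` with spark-submit (the class name and jar are placeholders):

```shell
spark-submit \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.Main \
  app.jar
```

With this in place, log lines emitted inside the UDF appear in the executor stderr logs (Spark UI executors tab, or `yarn logs -applicationId <appId>` on YARN), not on the driver console.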