How to keep PIG job log file when it is successful


Question

I noticed that when there is an error in running a PIG script, a log is generated and kept. But when there is no error, the log file is removed. Is there a way to keep the log file even when the job is successful?

Solution

By default, errors (e.g. script parsing errors) are logged to pig.logfile, which can be set in $PIG_HOME/conf/pig.properties. If you want to log status messages too, prepare a valid log4j.properties file and set it in the log4jconf property.

E.g. rename log4j.properties.template to log4j.properties in $PIG_HOME/conf and set the following:

log4j.logger.org.apache.pig=info, B

# ***** A is set to be a ConsoleAppender.
#log4j.appender.A=org.apache.log4j.ConsoleAppender
# ***** A uses PatternLayout.
#log4j.appender.A.layout=org.apache.log4j.PatternLayout
#log4j.appender.A.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n

# ***** B is set to be a FileAppender.
log4j.appender.B=org.apache.log4j.FileAppender
#log4j.appender.B.File=/home/user/pig-distrib/logs/pig_success.log
log4j.appender.B.File=/home/user/pig-distrib/logs/pig.log
log4j.appender.B.layout=org.apache.log4j.PatternLayout
log4j.appender.B.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
log4j.appender.B.Append=true
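For Pig to pick this file up, the log4jconf property has to point at it. A minimal sketch, assuming the illustrative paths above (pig.properties is the usual place for this setting):

```properties
# In $PIG_HOME/conf/pig.properties -- the path below is illustrative,
# not from the original answer:
log4jconf=/home/user/pig-distrib/conf/log4j.properties
```

Alternatively, it can be passed per run on the command line with `pig -4 /path/to/log4j.properties myscript.pig`, where -4 is Pig's short form of the -log4jconf option.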


When using Pig v0.10.0 (r1328203) I found that a successful Pig job doesn't write its history logs to the output directory on HDFS
(hadoop.job.history.user.location=${mapred.output.dir}/_logs/history/).

If you still want these history logs, set mapred.output.dir in your Pig script like this:

set mapred.output.dir '/user/hadoop/test/output';
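Putting it together, a minimal Pig Latin sketch; the input path, relation name, and schema are illustrative assumptions, not part of the original answer, and how the explicit set interacts with STORE's own output path may vary by Pig version:

```pig
-- Setting mapred.output.dir explicitly so the job's history logs
-- land under <output>/_logs/history/ even when the job succeeds.
set mapred.output.dir '/user/hadoop/test/output';

-- Hypothetical input and schema, for illustration only.
A = LOAD '/user/hadoop/test/input' USING PigStorage(',')
    AS (name:chararray, cnt:int);
STORE A INTO '/user/hadoop/test/output';
```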
