Separate logs from Apache Spark


Problem Description


I would like to have separate log files for workers, masters, and jobs (executors, submits, not sure what to call them). I tried a configuration in log4j.properties like

log4j.appender.myAppender.File=/some/log/dir/${log4j.myAppender.FileName}


and then passing log4j.myAppender.FileName in SPARK_MASTER_OPTS, SPARK_WORKER_OPTS, spark.executor.extraJavaOptions, and spark.driver.extraJavaOptions.
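For reference, the appender setup described here would look roughly like this in log4j.properties (the appender name myAppender comes from the question; the appender class and layout are illustrative assumptions):

```properties
# Route all logging through one file appender whose file name
# is substituted from a Java system property at startup
log4j.rootLogger=INFO, myAppender

log4j.appender.myAppender=org.apache.log4j.RollingFileAppender
log4j.appender.myAppender.File=/some/log/dir/${log4j.myAppender.FileName}
log4j.appender.myAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myAppender.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Note that in log4j 1.x, a ${log4j.myAppender.FileName} placeholder is resolved from Java system properties, so the value has to reach the JVM as a -Dlog4j.myAppender.FileName=... option, not as a bare key=value pair.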


It works perfectly well for workers and masters but fails for executors and drivers. Here is an example of how I use these:

./spark-submit ... --conf "\"spark.executor.extraJavaOptions=log4j.myAppender.FileName=myFileName some.other.option=foo\"" ...


I also tried putting log4j.myAppender.FileName with some default value in spark-defaults.conf, but that doesn't work either.


Is there some way to achieve what I want?

Recommended Answer


Logging for executors and drivers can be configured via conf/spark-defaults.conf by adding these entries (taken from my Windows configuration):

spark.driver.extraJavaOptions  -Dlog4j.configuration=file:C:/dev/programs/spark-1.2.0/conf/log4j-driver.properties
spark.executor.extraJavaOptions  -Dlog4j.configuration=file:C:/dev/programs/spark-1.2.0/conf/log4j-executor.properties


Note that each entry above references a different log4j properties file, so the driver and executor logging can be configured independently.
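As an illustration, the log4j-driver.properties referenced above might contain something like the following (the appender name, log file path, and pattern are assumptions, not taken from the answer; log4j-executor.properties would be analogous with its own file):

```properties
# Driver-only log4j configuration: send driver logs to a dedicated file
log4j.rootLogger=INFO, driverFile

log4j.appender.driverFile=org.apache.log4j.FileAppender
log4j.appender.driverFile.File=C:/dev/programs/spark-1.2.0/logs/driver.log
log4j.appender.driverFile.layout=org.apache.log4j.PatternLayout
log4j.appender.driverFile.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Because each JVM role loads its own -Dlog4j.configuration file, workers, masters, drivers, and executors all end up writing to separate files without any property substitution tricks.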

