Override Spark log4j configurations


Problem description


I'm running Spark on a YARN cluster with log4j.properties configured so that, by default, all logs go to a log file. However, for some Spark jobs I want the logs to go to the console, without changing the log4j file or the code of the actual job. What is the best way to achieve this? Thanks, all.

Recommended answer

I know of at least 4 solutions for this problem.

  1. You could modify the log4j.properties on your Spark machines.
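As a sketch of what that file can look like, here is a minimal log4j.properties that routes everything to the console (the appender name and pattern are illustrative; adjust the level and pattern to taste):

```properties
# Root logger: INFO level, all output to the console appender
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```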

  2. When you run a job with spark-submit, you can attach a log4j file as a configuration file, for example:

bin/spark-submit --class com.viaplay.log4jtest.log4jtest --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/Users/feng/SparkLog4j/SparkLog4jTest/target/log4j2.properties" --master local[*] /Users/feng/SparkLog4j/SparkLog4jTest/target/SparkLog4jTest-1.0-jar-with-dependencies.jar
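Note that the `file:` path in the example above is local to the submitting machine, which works for `--master local[*]` but not for executors on a YARN cluster. A common alternative there (a sketch; the file name `log4j-console.properties` and the paths are assumptions, not from the original answer) is to ship a per-job log4j file with `--files` and reference it by bare file name, so driver and executors pick up the distributed copy:

```shell
# Ship a job-specific log4j config to the cluster and point the
# driver and executor JVMs at the distributed copy by file name.
bin/spark-submit \
  --class com.viaplay.log4jtest.log4jtest \
  --master yarn \
  --files /path/to/log4j-console.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-console.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-console.properties" \
  target/SparkLog4jTest-1.0-jar-with-dependencies.jar
```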

  3. Try importing log4j in your application code:

import org.apache.log4j.Logger;
import org.apache.log4j.Level;

Then set those loggers in the function where you create your SparkContext:

Logger.getLogger("org").setLevel(Level.INFO);
Logger.getLogger("akka").setLevel(Level.INFO);

  4. Use spark.sql.SparkSession:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.getOrCreate()
spark.sparkContext.setLogLevel("ERROR")

