Scala: Write log to file with log4j


Problem description



I am trying to build a Scala-based jar file in Eclipse that uses log4j to write logs. It prints to the console perfectly, but when I try to use a log4j.properties file to make it write to a log file, nothing happens.

The project structure is as follows

loggerTest.scala

package scala.n*****.n****

import org.apache.log4j.Logger

object loggerTest extends LogHelper {
  def main(args : Array[String]){
    log.info("This is info");
    log.error("This is error");
    log.warn("This is warn");
  }
}

trait LogHelper {
  lazy val log = Logger.getLogger(this.getClass.getName)
}

log4j.properties

# Root logger option
log4j.rootLogger=WARN, stdout, file

# Redirect log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

# Redirect log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/home/abc/test/abc.log
log4j.appender.file.encoding=UTF-8
log4j.appender.file.MaxFileSize=2kB
log4j.appender.file.MaxBackupIndex=1
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>loggerTest</groupId>
<artifactId>loggerTest</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>loggerTest</name>
<description>loggerTest</description>

<repositories>
  <repository>
    <id>scala-tools.org</id>
    <name>Scala-tools Maven2 Repository</name>
    <url>http://scala-tools.org/repo-releases</url>
  </repository>
  <repository>
    <id>maven-hadoop</id>
    <name>Hadoop Releases</name>
    <url>https://repository.cloudera.com/content/repositories/releases/</url>
  </repository>
  <repository>
    <id>cloudera-repos</id>
    <name>Cloudera Repos</name>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>

<pluginRepositories>
  <pluginRepository>
    <id>scala-tools.org</id>
    <name>Scala-tools Maven2 Repository</name>
    <url>http://scala-tools.org/repo-releases</url>
  </pluginRepository>
</pluginRepositories>

<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>

<build>
    <plugins>
        <!-- mixed scala/java compile -->
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <executions>
                <execution>
                    <id>compile</id>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                    <phase>compile</phase>
                </execution>
                <execution>
                    <id>test-compile</id>
                    <goals>
                        <goal>testCompile</goal>
                    </goals>
                    <phase>test-compile</phase>
                </execution>
                <execution>
                    <phase>process-resources</phase>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>1.7</source>
                <target>1.7</target>
            </configuration>
        </plugin>

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-jar-plugin</artifactId>
            <configuration>
                <archive>
                    <manifest>
                        <addClasspath>true</addClasspath>
                        <mainClass>fully.qualified.MainClass</mainClass>
                    </manifest>
                </archive>
            </configuration>
        </plugin>
    </plugins>
    <pluginManagement>
        <plugins>
            <!--This plugin's configuration is used to store Eclipse m2e settings 
                only. It has no influence on the Maven build itself. -->
            <plugin>
                <groupId>org.eclipse.m2e</groupId>
                <artifactId>lifecycle-mapping</artifactId>
                <version>1.0.0</version>
                <configuration>
                    <lifecycleMappingMetadata>
                        <pluginExecutions>
                            <pluginExecution>
                                <pluginExecutionFilter>
                                    <groupId>org.scala-tools</groupId>
                                    <artifactId>
                                        maven-scala-plugin
                                    </artifactId>
                                    <versionRange>
                                        [2.15.2,)
                                    </versionRange>
                                    <goals>
                                        <goal>compile</goal>
                                        <goal>testCompile</goal>
                                    </goals>
                                </pluginExecutionFilter>
                                <action>
                                    <execute></execute>
                                </action>
                            </pluginExecution>
                        </pluginExecutions>
                    </lifecycleMappingMetadata>
                </configuration>
            </plugin>
        </plugins>
    </pluginManagement>
</build>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.4</version>
    </dependency>
</dependencies>

</project>

When I run it as a Maven build, it generates a jar file in the "target" folder.

I copy the jar to /home/abc/test

I run that jar with spark-submit using the command

   $ spark-submit --class scala.n*****.n*****.loggerTest loggerTest-0.0.1-SNAPSHOT.jar

The console output comes out fine, but it should also write to a file in /home/abc/log, which it does not.

Could someone please help?

Solution

When you deploy your application, you should define the log4j configuration file for both the driver and the executors, as follows:

spark-submit --class MAIN_CLASS --driver-java-options "-Dlog4j.configuration=file:PATH_OF_LOG4J" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:PATH_OF_LOG4J" --master MASTER_IP:PORT JAR_PATH
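
For example, a minimal sketch of that command for this particular project, assuming log4j.properties has been copied next to the jar in /home/abc/test and that the jar is submitted the same way as before, without an explicit --master (both of those details are assumptions, not stated in the question):

spark-submit --class scala.n*****.n*****.loggerTest --driver-java-options "-Dlog4j.configuration=file:/home/abc/test/log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/home/abc/test/log4j.properties" loggerTest-0.0.1-SNAPSHOT.jar

Note that -Dlog4j.configuration expects a URL, so the file: prefix before the absolute path matters. With the configuration picked up this way, the RollingFileAppender defined above should create /home/abc/test/abc.log on the driver host.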

For more details and a step-by-step solution, you can check this link: https://blog.knoldus.com/2016/02/23/logging-spark-application-on-standalone-cluster/
