Write Log4j output to HDFS


Question

Has anyone tried to write a log4j log file directly to the Hadoop Distributed File System?

If yes, please explain how to achieve this. I think I will have to create an Appender for it.

Is this the way? I need to write logs to a file at particular intervals and query that data at a later stage.
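
For context, the custom appender the question envisages could look roughly like the following. This is a minimal sketch against the log4j 1.x AppenderSkeleton and the Hadoop FileSystem API; the class name, target path, and flush strategy are illustrative assumptions, and it deliberately ignores concerns such as file rolling and reconnection:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

// Hypothetical appender that streams each formatted event to one file in HDFS.
public class HdfsAppender extends AppenderSkeleton {

    private FSDataOutputStream out;

    @Override
    public void activateOptions() {
        try {
            // Picks up fs.defaultFS from core-site.xml on the classpath.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            Path path = new Path("/logs/app.log"); // placeholder target path
            // Appending requires an HDFS version/configuration that supports it.
            out = fs.exists(path) ? fs.append(path) : fs.create(path);
        } catch (IOException e) {
            errorHandler.error("Cannot open log file in HDFS", e, 0);
        }
    }

    @Override
    protected void append(LoggingEvent event) {
        if (out == null) {
            return;
        }
        try {
            out.writeBytes(layout.format(event));
            out.hflush(); // flush so the data becomes visible to readers
        } catch (IOException e) {
            errorHandler.error("Cannot write log event to HDFS", e, 0);
        }
    }

    @Override
    public void close() {
        try {
            if (out != null) {
                out.close();
            }
        } catch (IOException e) {
            errorHandler.error("Cannot close HDFS log file", e, 0);
        }
    }

    @Override
    public boolean requiresLayout() {
        return true;
    }
}
```

Keeping one always-open stream per JVM, with no rolling or buffering, is exactly the kind of burden the answer below avoids by delegating HDFS interaction to Flume.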

Solution

I recommend using Apache Flume for this task. There is a Flume appender for Log4j. This way, you send logs to Flume, and it writes them to HDFS. The good thing about this approach is that Flume becomes a single point of communication with HDFS: it makes it easy to add new data sources without writing code that interacts with HDFS again and again.
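
As a concrete sketch of how the two halves could be wired together (hostnames, ports, and paths below are placeholders, and the flume-ng-log4jappender artifact is assumed to be on the application's classpath), the application side points its root logger at a Flume agent:

```properties
# log4j.properties on the application side
log4j.rootLogger=INFO, flume
log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=flume-host
log4j.appender.flume.Port=41414
```

The Flume agent then receives those events over Avro and writes them to HDFS:

```properties
# flume.conf on the agent: Avro source -> memory channel -> HDFS sink
agent.sources = avroSrc
agent.channels = memCh
agent.sinks = hdfsSink

agent.sources.avroSrc.type = avro
agent.sources.avroSrc.bind = 0.0.0.0
agent.sources.avroSrc.port = 41414
agent.sources.avroSrc.channels = memCh

agent.channels.memCh.type = memory

agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.channel = memCh
agent.sinks.hdfsSink.hdfs.path = hdfs://namenode:8020/flume/logs/%Y-%m-%d
agent.sinks.hdfsSink.hdfs.fileType = DataStream
agent.sinks.hdfsSink.hdfs.rollInterval = 300
agent.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
```

The rollInterval of 300 closes and rolls the HDFS file every five minutes, which lines up with the "particular intervals" requirement in the question; the date escape sequences in hdfs.path need an event timestamp, hence useLocalTimeStamp.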
