How to pipe tail -f into awk


Question

I'm trying to set up a script where an alert is generated when a certain string appears in a log file.

The solution already in place greps the whole log file once a minute and counts how often the string appears, using the log line's timestamp to count only occurrences in the previous minute.

I figured it would be much more efficient to do this with a tail, so I tried the following, as a test:

FILENAME="/var/log/file.log"

tail -f "$FILENAME" | awk -F , -v var="$HOSTNAME" '
                BEGIN {
                        failed_count=0;
                }
                /account failure reason/ {
                        failed_count++;
                }
                END {
                        printf("%saccount failure reason (Errors per Interval)=%d\n", var, failed_count);
                }
'

but this just hangs and doesn't output anything. Somebody suggested this minor change:

FILENAME="/var/log/file.log"

awk -F , -v var="$HOSTNAME" '
                BEGIN {
                        failed_count=0;
                }
                /account failure reason/ {
                        failed_count++;
                }
                END {
                        printf("%saccount failure reason (Errors per Interval)=%d\n", var, failed_count);
                }
' <(tail -f "$FILENAME")

but that does the same thing.

The awk I'm using (I've simplified it in the code above) works, as it's used in the existing script where the results of grep "^$TIMESTAMP" are piped into it.

My question is, how can I get tail -f to work with awk?

Answer

Assuming your log looks something like this:

Jul 13 06:43:18 foo account failure reason: unknown
 │   │    
 │   └── $2 in awk
 └────── $1 in awk
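The diagram relies on awk's default field splitting on whitespace, which gives you the month and day as the first two fields. A quick check on one sample line (the log line itself is made up for illustration):

```shell
# With default whitespace splitting, $1 is the month and $2 is the day.
echo 'Jul 13 06:43:18 foo account failure reason: unknown' |
awk '{ print $1, $2 }'
# → Jul 13
```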

you could do something like this:

FILENAME="/var/log/file.log"

tail -F "$FILENAME" | awk -v hostname="$HOSTNAME" '
    NR == 1 {                     # first line: remember its date ($1 = month, $2 = day)
        last = $1 " " $2;
    }
    $1 " " $2 != last {           # date changed: print a summary for the previous day, then reset
        printf("%s account failure reason (Errors on %s)=%d\n", hostname, last, failed);
        last = $1 " " $2;
        failed = 0;
    }
    /account failure reason/ {    # count matching lines for the current day
        failed++;
    }
'

Note that I've changed this to tail -F (capital F) because it handles log rotation. This isn't supported by every operating system, but it should work on modern BSDs and Linuxes.

How does this work?

Awk scripts consist of sets of test { commands; } pairs evaluated against each line of input. (There are two special tests, BEGIN and END, whose commands run when awk starts and when awk ends, respectively. In your question, awk never ended, so the END block was never run.)
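You can see this for yourself with a finite input (the sample lines below are made up): END fires only when awk reaches end-of-input, which a `tail -f` pipe never provides.

```shell
# printf closes its output when done, so awk sees EOF and END runs:
printf 'ok\naccount failure reason\nok\n' |
awk '/account failure reason/ { n++ } END { print "count=" n }'
# → count=1
# Replace printf with `tail -f file` and EOF never arrives,
# so the END block never executes and nothing is printed.
```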

The script above has three test/command sections:


  • In the first, NR == 1 is a test that evaluates true only on the first line of input. Its command sets the initial value of the last variable, used in the next section.
  • In the second, we test whether the date has changed since the last line that was evaluated. If it has, we're looking at a new day's data, so it's time to print a summary of the previous day, reset our counter, and move on.
  • In the third, if the line we're evaluating matches the regular expression /account failure reason/, we increment our counter.
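The steps above can be exercised against a finite sample to see the per-day rollover (the hostname and log lines here are hypothetical; the awk logic is the same as in the answer):

```shell
# Two "Jul 13" failures followed by a "Jul 14" line: the date change
# on the third line triggers the summary for Jul 13.
printf '%s\n' \
  'Jul 13 06:43:18 foo account failure reason: unknown' \
  'Jul 13 07:01:02 foo account failure reason: unknown' \
  'Jul 14 00:00:01 foo something else' |
awk -v hostname="myhost" '
    NR == 1 { last = $1 " " $2 }
    $1 " " $2 != last {
        printf("%s account failure reason (Errors on %s)=%d\n", hostname, last, failed)
        last = $1 " " $2
        failed = 0
    }
    /account failure reason/ { failed++ }
'
# → myhost account failure reason (Errors on Jul 13)=2
```

Note that with a finite input like this, the final day's count is never printed, because there is no later date to trigger the summary; with a live `tail -F` feed the next day's first line takes care of that.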

Clear as mud? :-)
