How do I keep Logstash running so it syncs data from my RDBMS to ES?


Question


I'm a complete newbie to the ELK stack, so please excuse my ignorance. I've been able to get Logstash to send data from my database to Elasticsearch, but it exits once it's done with the transfer. How do I keep it running so that it keeps them in sync? Thanks.

Answer


You need to specify a schedule in your jdbc input:


The schedule below (* * * * *) runs the query once every minute and selects only the records that have been updated since the last time the query ran. Logstash persists the timestamp of the previous run between executions and exposes it to the statement as :sql_last_value. Your updated timestamp field might be named differently; feel free to adjust it to fit your case.

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    parameters => { "some_field" => "value" }
    # cron-style schedule: run the statement once every minute
    schedule => "* * * * *"
    # :sql_last_value is the timestamp of the previous run, tracked by Logstash
    statement => "SELECT * from songs WHERE some_field = :some_field AND updated > :sql_last_value"
  }
}
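To make the incremental pattern behind the jdbc input concrete, here is a minimal Python sketch of the same idea: keep a watermark from the previous run and select only rows updated after it. The table, column names, and timestamp format are hypothetical (mirroring the config above), and an in-memory SQLite database stands in for MySQL.

```python
import sqlite3

# Hypothetical stand-in for the "songs" table from the config above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE songs (id INTEGER, title TEXT, updated TEXT)")
conn.executemany(
    "INSERT INTO songs VALUES (?, ?, ?)",
    [(1, "first", "2024-01-01T00:00:00"),
     (2, "second", "2024-01-02T00:00:00")],
)

# Logstash persists the last run time between executions and binds it
# as :sql_last_value; here a plain variable plays that role.
sql_last_value = "1970-01-01T00:00:00"

def fetch_updated(since):
    """Select only rows updated after the previous run, like
    `updated > :sql_last_value` in the jdbc input statement."""
    return conn.execute(
        "SELECT id, title, updated FROM songs "
        "WHERE updated > ? ORDER BY updated",
        (since,),
    ).fetchall()

# First "scheduled" run picks up everything.
first_batch = fetch_updated(sql_last_value)
if first_batch:
    sql_last_value = first_batch[-1][2]  # advance the watermark

# A row changes between runs; the next run sees only that row.
conn.execute("INSERT INTO songs VALUES (3, 'third', '2024-01-03T00:00:00')")
second_batch = fetch_updated(sql_last_value)
```

Each scheduled run therefore ships only the delta, which is why the pipeline can stay running indefinitely without re-sending the whole table.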

