Transport csv file with filebeat


Problem description


Case

Push a CSV file from a client PC to Elasticsearch on the server side.

Elasticsearch has been installed and works; I can access it from my PC and use the demo data. Now I would like to learn how to push my own data. I've prepared my data from Kaggle.

Client side

I've downloaded Filebeat on the client side and extracted it. I've edited filebeat.yml as follows:

filebeat.inputs:
- input_type: log
  paths:
    - C:\Users\Charles\Desktop\DATA\BrentOilPrices.csv
document_type: test_log_csv
output.logstash:
  hosts: ["10.64.2.246:5044"]

I also tested it with

./filebeat test config

It returns: Config OK

Server side

I edited logstash.conf as follows:

input {
  beats {
    port => 5044
  }
}

filter {
  if "test_log_csv" in [type] {
    csv {
      columns => ["Date","Price"]
      separator => ","
    }
    mutate {
      convert => ["Price","integer"]
    }
    date {
      match => ["Date","d/MMM/yy"]
    }
  }
}

output {
  if "test_log_csv" in [type] {
    elasticsearch {
      hosts => "127.0.0.1:9200"
      index => "test_log_csv%{+d/MM/yy}"
    }
  }
}
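The csv → mutate → date chain in the filter can be sketched in plain Python to sanity-check the parsing, assuming input rows shaped like 20/May/87,18.63 (the Date value is a made-up example that matches the d/MMM/yy pattern; the actual Kaggle file may format dates differently):

```python
from datetime import datetime

def parse_row(line):
    # Rough Python analogue of the Logstash filter chain:
    # csv: split on "," into the Date and Price columns;
    # date: parse Date with d/MMM/yy (strptime "%d/%b/%y");
    # mutate: convert Price to a number (the config above uses "integer",
    # which would drop the decimal part; float is used here to keep it).
    date_str, price_str = line.split(",")
    return {
        "Date": datetime.strptime(date_str, "%d/%b/%y"),
        "Price": float(price_str),
    }

row = parse_row("20/May/87,18.63")
```

If the dates in the CSV use a different separator (for example 20-May-87), the match pattern in the date filter has to change accordingly.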

Client side

I run

Start-Service filebeat

It returns nothing.

I checked my Kibana and there are no logs. What did I miss?

I edited filebeat.yml on the client side:

filebeat.inputs:
- input_type: log
  paths:
    - 'C:\Users\Charles\Desktop\DATA\BrentOilPrices.csv'
fields:
document_type: test_log_csv
output.logstash:
  hosts: ["10.64.2.246:5044"]

Solution

The document_type option was removed from Filebeat in version 6.X, so the type field is no longer created. Since your conditionals are based on this field, your pipeline will not work. Also, you should try to use forward slashes (/) even on Windows.
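The effect can be seen in a tiny sketch of the conditional (the events below are hypothetical, not actual Filebeat output):

```python
def matches_pipeline(event):
    # The Logstash filter and output blocks only act on events
    # whose top-level "type" field equals "test_log_csv".
    return event.get("type") == "test_log_csv"

# Before 6.X, document_type put "type" on the event; from 6.X on it is absent,
# so the conditional never matches and nothing reaches Elasticsearch.
pre_6x_event = {"message": "20/May/87,18.63", "type": "test_log_csv"}
post_6x_event = {"message": "20/May/87,18.63"}
```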

Try changing your config to the one below and test again.

filebeat.inputs:
- input_type: log
  paths:
    - 'C:/Users/Charles/Desktop/DATA/BrentOilPrices.csv'
  fields:
    type: test_log_csv
  fields_under_root: true
output.logstash:
  hosts: ["10.64.2.246:5044"]

The option fields_under_root: true will create the field type in the root of your document. If you remove this option, the field will be created as [fields][type] and you will need to change your conditionals to use that field.
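As a sketch of what fields_under_root changes (a simplified model of the event, not Filebeat's actual internals):

```python
def add_fields(event, fields, fields_under_root=False):
    # Merge the configured `fields` into a copy of the event:
    # at the root when fields_under_root is true, otherwise
    # nested under a "fields" key (addressed as [fields][type] in Logstash).
    event = dict(event)
    if fields_under_root:
        event.update(fields)
    else:
        event["fields"] = dict(fields)
    return event

base = {"message": "20/May/87,18.63"}
rooted = add_fields(base, {"type": "test_log_csv"}, fields_under_root=True)
nested = add_fields(base, {"type": "test_log_csv"})
```

With fields_under_root: true the conditional if "test_log_csv" in [type] matches; without it you would have to test [fields][type] instead.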

