Logstash exception Expected one of #, input, filter, output at line 1, column 1


Problem description

When I insert new data into my database (MySQL), Logstash doesn't pick it up dynamically. Below is logstash.conf (the file that connects Elasticsearch with MySQL):

input {
  jdbc { 
    jdbc_connection_string => "jdbc:mysql://localhost:3306/blog" # Database connection
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "C:\Users\saidb\Downloads\mysql-connector-java-5.1.47\mysql-connector-java-5.1.47.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    schedule => "* * * * *"
    statement => "SELECT * FROM blog_pro WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"

  }
}
output {
  elasticsearch {
   hosts => "localhost:9200"
   index => "blog_pro"
   document_type => "data"
  }
}

When I run the command logstash -f logstash.conf --debug, I get:

C:\logstash-6.5.4\bin>logstash -f logstash.conf --debug
Sending Logstash logs to C:/logstash-6.5.4/logs which is now configured via log4j2.properties
[2019-01-23T14:40:25,313][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"C:/logstash-6.5.4/modules/fb_apache/configuration"}
[2019-01-23T14:40:25,329][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x25c03375 @directory="C:/logstash-6.5.4/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2019-01-23T14:40:25,360][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"C:/logstash-6.5.4/modules/netflow/configuration"}
[2019-01-23T14:40:25,360][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x31208cab @directory="C:/logstash-6.5.4/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] node.name: "LAPTOP-74TV0043"
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] *path.config: "logstash.conf"
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] path.data: "C:/logstash-6.5.4/data"
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] modules.cli: []
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] modules: []
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] modules_list: []
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] modules_variable_list: []
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] modules_setup: false
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] config.test_and_exit: false
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] config.reload.automatic: false
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] config.support_escapes: false
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] config.field_reference.parser: "COMPAT"
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] metric.collect: true
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] pipeline.id: "main"
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] pipeline.system: false
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] pipeline.workers: 8
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] pipeline.output.workers: 1
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2019-01-23T14:40:25,454][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] pipeline.java_execution: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] path.plugins: []
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] config.debug: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] version: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] help: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] log.format: "plain"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] http.port: 9600..9700
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] http.environment: "production"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.type: "memory"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.drain: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.max_events: 0
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] keystore.file: "C:/logstash-6.5.4/config/logstash.keystore"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] path.queue: "C:/logstash-6.5.4/data/queue"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] path.dead_letter_queue: "C:/logstash-6.5.4/data/dead_letter_queue"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] path.settings: "C:/logstash-6.5.4/config"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] path.logs: "C:/logstash-6.5.4/logs"
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2019-01-23T14:40:25,469][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2019-01-23T14:40:25,485][DEBUG][logstash.runner          ] xpack.management.elasticsearch.url: ["https://localhost:9200"]
[2019-01-23T14:40:25,488][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2019-01-23T14:40:25,488][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2019-01-23T14:40:25,489][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2019-01-23T14:40:25,489][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.url: ["http://localhost:9200"]
[2019-01-23T14:40:25,490][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2019-01-23T14:40:25,490][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2019-01-23T14:40:25,491][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2019-01-23T14:40:25,491][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2019-01-23T14:40:25,491][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2019-01-23T14:40:25,492][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2019-01-23T14:40:25,492][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2019-01-23T14:40:25,492][DEBUG][logstash.runner          ] node.uuid: ""
[2019-01-23T14:40:25,493][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2019-01-23T14:40:25,517][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-23T14:40:25,579][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.4"}
[2019-01-23T14:40:25,611][DEBUG][logstash.agent           ] Setting global FieldReference parsing mode: COMPAT
[2019-01-23T14:40:25,657][DEBUG][logstash.agent           ] Setting up metric collection
[2019-01-23T14:40:25,720][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-01-23T14:40:25,736][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-01-23T14:40:25,892][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-01-23T14:40:26,048][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-23T14:40:26,048][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-23T14:40:26,064][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-01-23T14:40:26,079][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-01-23T14:40:26,157][DEBUG][logstash.agent           ] Starting agent
[2019-01-23T14:40:26,204][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["C:/logstash-6.5.4/bin/benchmark.sh", "C:/logstash-6.5.4/bin/cpdump", "C:/logstash-6.5.4/bin/dependencies-report", "C:/logstash-6.5.4/bin/ingest-convert.sh", "C:/logstash-6.5.4/bin/logstash", "C:/logstash-6.5.4/bin/logstash-keystore", "C:/logstash-6.5.4/bin/logstash-keystore.bat", "C:/logstash-6.5.4/bin/logstash-plugin", "C:/logstash-6.5.4/bin/logstash-plugin.bat", "C:/logstash-6.5.4/bin/logstash.bat", "C:/logstash-6.5.4/bin/logstash.lib.sh", "C:/logstash-6.5.4/bin/pqcheck", "C:/logstash-6.5.4/bin/pqrepair", "C:/logstash-6.5.4/bin/ruby", "C:/logstash-6.5.4/bin/setup.bat", "C:/logstash-6.5.4/bin/system-install"]}
[2019-01-23T14:40:26,220][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"C:/logstash-6.5.4/bin/logstash.conf"}
[2019-01-23T14:40:26,282][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2019-01-23T14:40:26,298][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2019-01-23T14:40:26,502][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 1, column 1 (byte 1) after ", :backtrace=>["C:/logstash-6.5.4/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "C:/logstash-6.5.4/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "C:/logstash-6.5.4/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "C:/logstash-6.5.4/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "C:/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "C:/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "C:/logstash-6.5.4/logstash-core/lib/logstash/pipeline_action/create.rb:42:in `block in execute'", "C:/logstash-6.5.4/logstash-core/lib/logstash/agent.rb:92:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in `synchronize'", "C:/logstash-6.5.4/logstash-core/lib/logstash/agent.rb:92:in `exclusive'", "C:/logstash-6.5.4/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "C:/logstash-6.5.4/logstash-core/lib/logstash/agent.rb:317:in `block in converge_state'"]}
[2019-01-23T14:40:26,580][DEBUG][logstash.agent           ] Starting puma
[2019-01-23T14:40:26,580][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2019-01-23T14:40:26,595][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2019-01-23T14:40:26,611][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2019-01-23T14:40:26,611][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2019-01-23T14:40:26,611][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2019-01-23T14:40:26,642][DEBUG][logstash.api.service     ] [api-service] start
[2019-01-23T14:40:27,033][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-23T14:40:31,798][DEBUG][logstash.agent           ] Shutting down all pipelines {:pipelines_count=>0}
[2019-01-23T14:40:31,798][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}

Answer

The exception Expected one of #, input, filter, output at line 1, column 1 (byte 1) after usually occurs when:

  • The config file is in an incorrect format - save the logstash.conf file as UTF-8 (without a BOM)
  • The config file contains illegal characters between the input, filter, or output blocks - for example, remove stray whitespace and newlines at the beginning of the file
  • An invalid character has been used in a comment - remove the comment or re-encode the file as UTF-8
  • An incorrect separator has been used between a key and its value - check that every key and value are separated by =>
  • Logstash is loading a different config file - make sure the config file is in the Logstash bin folder, or provide the full path when running Logstash: -f PATH_TO/logstash.conf
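
The first three causes are all file-level problems visible in the raw bytes. As a quick sanity check before running Logstash, something like the following sketch (the function names are mine; this is not Logstash's actual parser) flags a UTF-8 BOM or leading garbage that would trip the parser at line 1, column 1:

```python
# A minimal sketch of the file-level checks behind this error: it looks
# at the first token the config compiler would see, after stripping a
# UTF-8 BOM and leading whitespace.

def first_directive(path):
    """Return the first token in the config file, with any UTF-8 BOM
    and leading whitespace removed."""
    with open(path, "rb") as f:
        data = f.read()
    # A UTF-8 BOM (EF BB BF) is invisible in most editors but is read
    # by the parser as an illegal character at line 1, column 1 (byte 1).
    if data.startswith(b"\xef\xbb\xbf"):
        data = data[3:]
    text = data.decode("utf-8").lstrip()
    return text.split("{", 1)[0].strip() if text else ""

def looks_valid(path):
    """True if the file starts with one of: #, input, filter, output."""
    return first_directive(path).startswith(("#", "input", "filter", "output"))
```

Note that the debug log above shows Logstash reading C:/logstash-6.5.4/bin/logstash.conf, so the copy in the bin folder is the one any such check should be pointed at.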

