Copy data from one index to another in Elasticsearch


Problem Description



How do I copy data from one index to another in Elasticsearch (where the indices may be on the same host or on different hosts)?

Output

Reading config file {:file=>"logstash/agent.rb", :level=>:debug, :line=>"309", :method=>"local_config"}
Compiled pipeline code:
        @inputs = []
        @filters = []
        @outputs = []
        @periodic_flushers = []
        @shutdown_flushers = []

          @input_elasticsearch_1 = plugin("input", "elasticsearch", LogStash::Util.hash_merge_many({ "hosts" => ("input hostname") }, { "port" => ("9200") }, { "index" => (".kibana") }, { "size" => 500 }, { "scroll" => ("5m") }, { "docinfo" => ("true") }))

          @inputs << @input_elasticsearch_1

          @output_elasticsearch_2 = plugin("output", "elasticsearch", LogStash::Util.hash_merge_many({ "host" => ("output hostname") }, { "port" => 9200 }, { "protocol" => ("http") }, { "manage_template" => ("false") }, { "index" => ("order-logs-sample") }, { "document_type" => ("logs") }, { "document_id" => ("%{id}") }, { "workers" => 1 }))

          @outputs << @output_elasticsearch_2

  def filter_func(event)
    events = [event]
    @logger.debug? && @logger.debug("filter received", :event => event.to_hash)
    events
  end
  def output_func(event)
    @logger.debug? && @logger.debug("output received", :event => event.to_hash)
    @output_elasticsearch_2.handle(event)

  end {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"29", :method=>"initialize"}
Plugin not defined in namespace, checking for plugin file {:type=>"input", :name=>"elasticsearch", :path=>"logstash/inputs/elasticsearch", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"133", :method=>"lookup"}
Plugin not defined in namespace, checking for plugin file {:type=>"codec", :name=>"json", :path=>"logstash/codecs/json", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"133", :method=>"lookup"}
config LogStash::Codecs::JSON/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@hosts = ["ccwlog-stg1-01"] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@port = 9200 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@index = ".kibana" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@size = 500 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@scroll = "5m" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@docinfo = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@debug = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@codec = <LogStash::Codecs::JSON charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@add_field = {} {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@query = "{\"query\": { \"match_all\": {} } }" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@scan = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@docinfo_target = "@metadata" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@docinfo_fields = ["_index", "_type", "_id"] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Inputs::Elasticsearch/@ssl = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
Plugin not defined in namespace, checking for plugin file {:type=>"output", :name=>"elasticsearch", :path=>"logstash/outputs/elasticsearch", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"133", :method=>"lookup"}
'[DEPRECATED] use `require 'concurrent'` instead of `require 'concurrent_ruby'`
[2016-01-22 03:49:34.451]  WARN -- Concurrent: [DEPRECATED] Java 7 is deprecated, please use Java 8.
Java 7 support is only best effort, it may not work. It will be removed in next release (1.0).
Plugin not defined in namespace, checking for plugin file {:type=>"codec", :name=>"plain", :path=>"logstash/codecs/plain", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"133", :method=>"lookup"}
config LogStash::Codecs::Plain/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@host = ["ccwlog-stg1-01"] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@port = 9200 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@protocol = "http" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@manage_template = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@index = "order-logs-sample" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@document_type = "logs" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@document_id = "%{id}" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@workers = 1 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@type = "" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@tags = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@exclude_tags = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@template_name = "logstash" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@template_overwrite = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@embedded = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@embedded_http_port = "9200-9300" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@max_inflight_requests = 50 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@flush_size = 5000 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@action = "index" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@path = "/" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@ssl = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@sniffing = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@max_retries = 3 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@retry_max_items = 5000 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
config LogStash::Outputs::ElasticSearch/@retry_max_interval = 5 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
Normalizing http path {:path=>"/", :normalized=>"/", :level=>:debug, :file=>"logstash/outputs/elasticsearch.rb", :line=>"342", :method=>"register"}
Create client to elasticsearch server on ccwlog-stg1-01: {:level=>:info, :file=>"logstash/outputs/elasticsearch.rb", :line=>"422", :method=>"register"}
Plugin is finished {:plugin=><LogStash::Inputs::Elasticsearch hosts=>["ccwlog-stg1-01"], port=>9200, index=>".kibana", size=>500, scroll=>"5m", docinfo=>true, debug=>false, codec=><LogStash::Codecs::JSON charset=>"UTF-8">, query=>"{\"query\": { \"match_all\": {} } }", scan=>true, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>, :level=>:info, :file=>"logstash/plugin.rb", :line=>"61", :method=>"finished"}
New Elasticsearch output {:cluster=>nil, :host=>["ccwlog-stg1-01"], :port=>9200, :embedded=>false, :protocol=>"http", :level=>:info, :file=>"logstash/outputs/elasticsearch.rb", :line=>"439", :method=>"register"}
Pipeline started {:level=>:info, :file=>"logstash/pipeline.rb", :line=>"87", :method=>"run"}
Logstash startup completed
output received {:event=>{"title"=>"logindex", "timeFieldName"=>"@timestamp", "fields"=>"[{\"name\":\"caller\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"_source\",\"type\":\"_source\",\"count\":0,\"scripted\":false,\"indexed\":false,\"analyzed\":false,\"doc_values\":false},{\"name\":\"exception\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"type\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"@version\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"serviceName\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"_type\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":false,\"doc_values\":false},{\"name\":\"_id\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":false,\"analyzed\":false,\"doc_values\":false},{\"name\":\"userId\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"path\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"orderId\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"dc\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"tags\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"host\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"_index\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":false,\"analyzed\":false,\"doc_values\":false},{\"name\":\"elapsedTime\",\"type\":\"number\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":false,\"doc_values\":false},{\"name\":\"message\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false},{\"name\":\"@timestamp\",\"type\":\"date\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":false,\"doc_values\":false},{\"name\":\"performanceRequest\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":true,\"doc_values\":false}]", "@version"=>"1", "@timestamp"=>"2016-01-22T11:49:35.268Z"}, :level=>:debug, :file=>"(eval)", :line=>"21", :method=>"output_func"}

Solution

A simple way to do this is to use Logstash with an elasticsearch input plugin and an elasticsearch output plugin.

The benefit of this solution is that you don't have to rewrite the boilerplate code for scan/scroll and bulk re-indexing, which is exactly what Logstash already provides.

After installing Logstash, you can create a configuration file copy.conf that looks like this:

input {
  elasticsearch {
    hosts => ["localhost:9200"]                    # source ES host
    index => "source_index"
  }
}
filter {
  mutate {
    remove_field => [ "@version", "@timestamp" ]   # remove fields added by Logstash
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]                    # target ES host
    manage_template => false
    index => "target_index"
    document_id => "%{id}"                         # name of your ID field
    workers => 1
  }
}
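
Note that document_id => "%{id}" assumes your documents carry an id field of their own. If the goal is instead to keep the original Elasticsearch _id values, the input plugin's docinfo option (the debug log above shows it storing _index, _type and _id under @metadata) can be combined with a metadata field reference in the output. A minimal sketch of that variant, assuming the same hosts as above:

input {
  elasticsearch {
    hosts   => ["localhost:9200"]
    index   => "source_index"
    docinfo => true                              # exposes _index, _type and _id under @metadata
  }
}
output {
  elasticsearch {
    hosts           => ["localhost:9200"]
    manage_template => false
    index           => "target_index"
    document_id     => "%{[@metadata][_id]}"     # reuse the source document's original _id
    workers         => 1
  }
}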

Then, after setting the correct values (source/target host and source/target index), you can run the copy with bin/logstash -f copy.conf.
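
If the copy needs tuning for a large index, the scan/scroll behaviour that Logstash handles for you is also configurable on the elasticsearch input. The values below are simply the defaults visible in the debug log above (500 documents per scroll page, a 5-minute scroll keep-alive, and a match_all query), shown here as a sketch:

input {
  elasticsearch {
    hosts  => ["localhost:9200"]
    index  => "source_index"
    query  => '{ "query": { "match_all": {} } }'   # which documents to copy
    size   => 500                                  # documents fetched per scroll page
    scroll => "5m"                                 # keep-alive window for each scroll context
  }
}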
