Dumping elastic data into CSV or into any NoSQL through Python


Problem description

As we know, we can't fetch more than 10,000 rows from Elasticsearch in Python because of the result-window limit (it surfaces as a connection error). I want two hours of data from my Elastic cluster, and for every 5 minutes I have approximately 10,000 observations.

1.) Is there any way I can dump the data from Elasticsearch directly into CSV, or into some NoSQL DB, when the count is more than 10,000?

I am writing my code in Python.

I am using Elasticsearch version 5.
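For context, the 10,000-row cap described above corresponds to Elasticsearch's `index.max_result_window` setting (default 10,000), which bounds plain `from`/`size` pagination; scroll queries are the usual way around it. The limit can also be raised per index, at a memory cost — a sketch of the settings call, with the index name as a placeholder:

```
PUT /your_index_name/_settings
{
  "index": {
    "max_result_window": 50000
  }
}
```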

Recommended answer

Try the code below for a scroll query:

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch()
es_index = "your_index_name"
documento = "your_doc_type"

# "your_user_value" is a placeholder for the value you are filtering on
body = {
    "query": {
        "term": {"user": "your_user_value"}
    }
}

# helpers.scan wraps the scroll API and yields every matching hit,
# so it is not limited to 10,000 results
res = helpers.scan(
    client=es,
    scroll='2m',
    query=body,
    index=es_index,
    doc_type=documento,
)
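Since the question asks about CSV specifically, here is a minimal sketch of writing the scanned hits out to a file. `hits_to_csv` is a hypothetical helper name; it assumes each hit carries a flat `"_source"` dict (as `helpers.scan` yields) and takes the column names from the first hit:

```python
import csv

def hits_to_csv(hits, path):
    """Write an iterable of Elasticsearch hit dicts to a CSV file.

    Each hit is expected to have a flat "_source" dict, as yielded
    by helpers.scan. Columns are taken from the first hit's keys.
    Returns the number of rows written.
    """
    writer = None
    count = 0
    with open(path, "w", newline="") as f:
        for hit in hits:
            row = hit["_source"]
            if writer is None:
                writer = csv.DictWriter(f, fieldnames=sorted(row))
                writer.writeheader()
            writer.writerow(row)
            count += 1
    return count
```

Usage against the scroll result would be `hits_to_csv(res, "dump.csv")`; because `res` is a generator, the whole result set streams to disk without being held in memory.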

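For the NoSQL half of the question, the scanned hits can be bulk-inserted in fixed-size chunks. `chunked_sources` is a hypothetical helper, and the pymongo usage at the end is an untested assumption about your setup (any database with a bulk-insert API would work the same way):

```python
from itertools import islice

def chunked_sources(hits, size=500):
    """Yield lists of "_source" dicts from an iterable of ES hits,
    at most `size` per list, suitable for a bulk insert."""
    it = (hit["_source"] for hit in hits)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical usage with pymongo (requires a running MongoDB):
# from pymongo import MongoClient
# coll = MongoClient()["mydb"]["elastic_dump"]
# for batch in chunked_sources(res):
#     coll.insert_many(batch)
```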