Neo4j & Spring Data Neo4j 4.0.0: Importing large datasets


Question

I want to insert real-time logging data into Neo4j 2.2.1 through Spring Data Neo4j 4.0.0. The logging data is very large and may reach hundreds of thousands of records. What is the best way to implement this kind of functionality? Is it safe to just call the .save(Iterable) method once all the node entity objects have been created? Is there something like a batch insertion mechanism in Spring Data Neo4j 4.0.0? Thanks in advance!

Answer

As SDN4 can work directly with existing databases, you can use neo4j-import for the initial import.
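
A minimal sketch of such an initial bulk load with the neo4j-import tool bundled with Neo4j 2.2, run offline against a stopped database; the CSV file names and target store directory below are assumptions for illustration:

```
# Hypothetical file names; the CSV headers must follow the neo4j-import format
bin/neo4j-import --into data/graph.db --nodes logs.csv --relationships rels.csv
```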

From Neo4j 2.2 onwards, Neo4j can also sustain highly concurrent write loads of parameterized Cypher, so you should be able to multi-thread adding data to Neo4j using SDN4, i.e. create batches of, say, 1,000 to 10,000 objects and send them off (see the sketch below).
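
A minimal sketch of that multi-threaded batching approach, assuming a hypothetical LogEntry node entity and LogEntryRepository SDN4 repository (neither name comes from the question):

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.neo4j.ogm.annotation.GraphId;
import org.neo4j.ogm.annotation.NodeEntity;
import org.springframework.data.neo4j.repository.GraphRepository;

// Hypothetical node entity for one log record
@NodeEntity
class LogEntry {
    @GraphId private Long id;
    String message;
    long timestamp;
}

// Hypothetical SDN4 repository; save(Iterable) is inherited from CrudRepository
interface LogEntryRepository extends GraphRepository<LogEntry> {
}

public class LogImporter {

    private static final int BATCH_SIZE = 5000; // somewhere in the 1k-10k range
    private static final int THREADS = 4;

    private final LogEntryRepository repository;

    public LogImporter(LogEntryRepository repository) {
        this.repository = repository;
    }

    public void importAll(List<LogEntry> entries) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(THREADS);
        for (int i = 0; i < entries.size(); i += BATCH_SIZE) {
            // Slice off the next batch; subList is a view, so the source list
            // must not be modified while the import is running
            final List<LogEntry> batch =
                    entries.subList(i, Math.min(i + BATCH_SIZE, entries.size()));
            // Each save(Iterable) call persists one batch in its own transaction
            pool.submit(() -> repository.save(batch));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}
```

Keeping each batch in its own transaction bounds memory usage on both the client and the server, and if a batch fails only that batch needs to be retried.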

Otherwise, you can send parameterized Cypher statements directly and concurrently to Neo4j.
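
For example, here is a minimal sketch that posts one parameterized statement to the transactional HTTP endpoint available since Neo4j 2.0; the label, properties, and parameter values are illustrative assumptions, and if authentication is enabled (the default in 2.2) an Authorization header is also required:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class CypherHttpExample {

    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:7474/db/data/transaction/commit");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setRequestProperty("Accept", "application/json");
        conn.setDoOutput(true);

        // One parameterized statement: Neo4j caches the query plan and only
        // the parameter map changes between requests.
        String payload = "{\"statements\":[{"
                + "\"statement\":\"CREATE (l:LogEntry {message:{message},ts:{ts}})\","
                + "\"parameters\":{\"message\":\"login failed\",\"ts\":1435000000}}]}";

        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}
```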
