Camel Sql Consumer Performance for Large DataSets

Question

I am trying to cache some static data in an Ignite cache in order to query it faster, so I need to read the data from the database and insert it into the cache cluster.

But there are about 3 million rows, and that normally causes an OutOfMemory error because the SqlComponent tries to process all the data as one batch, collecting the entire result set in one go.

Is there any way to split the result set while reading it (for example, 1000 items per Exchange)?

Answer

You can add a limit to the SQL query, depending on which SQL database you use.
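For example, with a database that supports a LIMIT clause, the consumer query itself can cap how many rows each poll pulls in. The following is only a minimal sketch: the static_data table, the processed flag column, and the myDataSource bean are assumptions made for illustration, not details from the question.

import org.apache.camel.builder.RouteBuilder;

public class LimitedSqlConsumerRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Poll at most 1000 unprocessed rows at a time instead of the whole
        // 3-million-row table; by default the SQL consumer routes each polled
        // row as its own Exchange.
        from("sql:select * from static_data where processed = 0 order by id limit 1000"
                + "?dataSource=#myDataSource")
            .to("log:batch");
    }
}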

Or you can try using the jdbcTemplate.maxRows=1000 option. But whether limiting works via that option depends on the JDBC driver.
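As a sketch of that second option, the snippet below simply copies the jdbcTemplate.maxRows=1000 option from the answer onto the endpoint URI. Whether this exact option name is accepted (some camel-sql versions expose JdbcTemplate properties under a different prefix) and whether it actually caps the result set depends on the Camel version and the JDBC driver, so treat it as an assumption to verify against the camel-sql documentation.

import org.apache.camel.builder.RouteBuilder;

public class MaxRowsSqlConsumerRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Ask the underlying Spring JdbcTemplate to cap the result set at
        // 1000 rows; the option name follows the answer above, and the JDBC
        // driver must honour maxRows for this to have any effect.
        from("sql:select * from static_data"
                + "?dataSource=#myDataSource"
                + "&jdbcTemplate.maxRows=1000")
            .to("log:rows");
    }
}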

Also, mind that you need some way to mark or delete the rows after processing so they are not selected again by the next query, for example by using the onConsume option.
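Here is a minimal sketch of the onConsume approach, combined with the LIMIT idea from above so each poll picks up the next unprocessed batch; the static_data table and processed column are again hypothetical names used only for illustration.

import org.apache.camel.builder.RouteBuilder;

public class OnConsumeSqlRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Select a bounded batch of unprocessed rows; after each row has been
        // routed, onConsume marks it as processed so the next poll moves on to
        // the following batch. :#id is bound from the row's own "id" column.
        from("sql:select * from static_data where processed = 0 order by id limit 1000"
                + "?dataSource=#myDataSource"
                + "&onConsume=update static_data set processed = 1 where id = :#id")
            // each row could be inserted into the Ignite cache here
            .to("log:consumed");
    }
}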

You can look at the unit tests to find some examples with onConsume etc.: https://github.com/apache/camel/tree/master/components/camel-sql/src/test
