Camel Sql Consumer Performance for Large DataSets


Problem description

I am trying to cache some static data in an Ignite cache in order to query it faster, so I need to read the data from the database and insert it into the cache cluster.

But there are about 3 million rows, and this normally causes an OutOfMemory error because the SqlComponent tries to process all the data at once, collecting the entire result set in one go.

Is there any way to split the result set while it is being read (for example, 1000 items per Exchange)?

Answer

You can add a limit in the SQL query, depending on which SQL database you use.
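For illustration, here is a minimal sketch of a route that applies such a limit in the consumer's query. The static_data table, its processed column, and the cacheLoader bean are made-up names, and the LIMIT syntax is PostgreSQL/MySQL style (other databases use FETCH FIRST n ROWS ONLY, TOP, or ROWNUM):

```java
import org.apache.camel.builder.RouteBuilder;

// Sketch only: static_data, its processed column and the cacheLoader bean are
// hypothetical; adjust the LIMIT syntax to your database dialect.
public class LimitedSqlRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Each poll fetches at most 1000 rows, so the 3 million rows are never
        // collected into memory in one go.
        from("sql:select * from static_data where processed = 0 limit 1000")
            .to("bean:cacheLoader"); // hypothetical bean that puts rows into the Ignite cache
    }
}
```

On its own a plain limit would keep re-selecting the same rows on every poll, which is why the rows also have to be marked or deleted after processing, as described below.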

Or you can try the option jdbcTemplate.maxRows=1000. Whether limiting works via that option depends on the JDBC driver.
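As a sketch of what that could look like (the option name is taken from the answer above; depending on the Camel version the JdbcTemplate may instead be configured through a template.maxRows style option, so check the SQL component documentation for your release):

```java
import org.apache.camel.builder.RouteBuilder;

// Sketch only: maxRows is passed through to the underlying Spring JdbcTemplate,
// which asks the JDBC driver to cap the result set at 1000 rows. Whether the
// driver honours this, and the exact option name, should be verified.
public class MaxRowsSqlRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("sql:select * from static_data?jdbcTemplate.maxRows=1000")
            .to("bean:cacheLoader"); // hypothetical cache-loading bean
    }
}
```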

Also note that you need some way to mark or delete the rows after processing, so they are not selected by the next query, for example by using the onConsume option.
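A minimal sketch combining the limited query with onConsume, along the lines of the examples in the Camel SQL documentation (table and column names are again hypothetical; :#id is bound from the id column of each consumed row):

```java
import org.apache.camel.builder.RouteBuilder;

// Sketch only: each consumed row is flagged as processed right away,
// so the next poll's query no longer selects it.
public class OnConsumeSqlRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("sql:select * from static_data where processed = 0 limit 1000"
                + "?onConsume=update static_data set processed = 1 where id = :#id")
            .to("bean:cacheLoader"); // hypothetical cache-loading bean
    }
}
```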

You can look at the unit tests to find some examples with onConsume etc.: https://github.com/apache/camel/tree/master/components/camel-sql/src/test
