Is there an elegant way to process a stream in chunks?


Problem description


My exact scenario is inserting data into a database in batches, so I want to accumulate DOM objects and flush them every 1000.


I implemented it by putting code in the accumulator to detect fullness and then flush, but that seems wrong - the flush control should come from the caller.


I could convert the stream to a List then use subList in an iterative fashion, but that too seems clunky.
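For illustration, that subList variant might look like this (a minimal sketch; `flushChunk` is a hypothetical stand-in for the batch insert):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class SubListChunks {
    static void flushChunk(List<Integer> chunk) {
        System.out.println("flushing " + chunk.size() + " items");
    }

    public static void main(String[] args) {
        int chunkSize = 1000;
        // Materialize the whole stream first - this is the clunky part.
        List<Integer> all = IntStream.range(0, 2500).boxed().collect(Collectors.toList());
        for (int from = 0; from < all.size(); from += chunkSize) {
            flushChunk(all.subList(from, Math.min(from + chunkSize, all.size())));
        }
    }
}
```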


Is there a neat way to take action every n elements and then continue with the stream, while only processing the stream once?

Recommended answer


Elegance is in the eye of the beholder. If you don't mind using a stateful function in groupingBy, you can do this:

AtomicInteger counter = new AtomicInteger();

stream.collect(groupingBy(x -> counter.getAndIncrement() / chunkSize))
    .values()
    .forEach(database::flushChunk);
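A self-contained version of that approach, with the stream and `database::flushChunk` replaced by hypothetical stand-ins so it can be run as-is:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class GroupingByChunks {
    public static void main(String[] args) {
        int chunkSize = 1000;
        AtomicInteger counter = new AtomicInteger();
        // Consecutive elements get the same key, so each group is one chunk.
        Map<Integer, List<Integer>> chunks = IntStream.range(0, 2500).boxed()
                .collect(Collectors.groupingBy(x -> counter.getAndIncrement() / chunkSize));
        chunks.values().forEach(chunk -> System.out.println("chunk of " + chunk.size()));
    }
}
```

Note that the counter makes the classifier stateful, so this only behaves as intended on a sequential stream.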


This doesn't win any performance or memory usage points over your original solution because it will still materialize the entire stream before doing anything.


If you want to avoid materializing the list, the Stream API will not help you. You will have to get the stream's iterator or spliterator and do something like this:

Spliterator<Integer> split = stream.spliterator();
int chunkSize = 1000;

while (true) {
    List<Integer> chunk = new ArrayList<>(chunkSize);
    // tryAdvance returns false once the stream is exhausted.
    for (int i = 0; i < chunkSize && split.tryAdvance(chunk::add); i++);
    if (chunk.isEmpty()) break;
    database.flushChunk(chunk);
}
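Put together as a runnable sketch (again with a hypothetical `flushChunk` standing in for the database call), the loop pulls at most `chunkSize` elements at a time and never materializes the whole stream:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Spliterator;
import java.util.stream.IntStream;

public class SpliteratorChunks {
    static void flushChunk(List<Integer> chunk) {
        System.out.println("flushed " + chunk.size() + " items");
    }

    public static void main(String[] args) {
        Spliterator<Integer> split = IntStream.range(0, 2500).boxed().spliterator();
        int chunkSize = 1000;

        while (true) {
            List<Integer> chunk = new ArrayList<>(chunkSize);
            // Pull up to chunkSize elements; stop early if the stream runs out.
            for (int i = 0; i < chunkSize && split.tryAdvance(chunk::add); i++);
            if (chunk.isEmpty()) break;
            flushChunk(chunk);
        }
    }
}
```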

