Is there an elegant way to process a stream in chunks?

2022-01-22  chunking java-8 java java-stream

My exact scenario is inserting data into a database in batches, so I want to accumulate DOM objects and then flush them every 1000.

I implemented it by putting code in the accumulator to detect fullness and then flush, but that seems wrong: the flush control should come from the caller.
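
For illustration, a rough sketch of what that accumulator-based variant might look like (stream, chunkSize and database.flushChunk are assumed names matching the snippets in the answer below, and Integer is a stand-in element type):

List<Integer> buffer = new ArrayList<>();
stream.forEach(element -> {
    buffer.add(element);
    if (buffer.size() == chunkSize) {
        database.flushChunk(new ArrayList<>(buffer)); // flush a full chunk and start a new one
        buffer.clear();
    }
});
if (!buffer.isEmpty()) {
    database.flushChunk(buffer); // flush the final, partially filled chunk
}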

I could convert the stream to a List then use subList in an iterative fashion, but that too seems clunky.
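
For comparison, a minimal sketch of that List-plus-subList approach, assuming the whole stream fits in memory (again with the same assumed names and Integer stand-in element type):

List<Integer> all = stream.collect(Collectors.toList());
for (int from = 0; from < all.size(); from += chunkSize) {
    int to = Math.min(from + chunkSize, all.size());
    database.flushChunk(all.subList(from, to));
}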

Is there a neat way to take action every n elements and then continue with the stream, while only processing the stream once?

Recommended Answer

Elegance is in the eye of the beholder. If you don't mind using a stateful function in groupingBy, you can do this:

// assumes AtomicInteger from java.util.concurrent.atomic and a static import of java.util.stream.Collectors.groupingBy
AtomicInteger counter = new AtomicInteger();

stream.collect(groupingBy(x -> counter.getAndIncrement() / chunkSize))
    .values()
    .forEach(database::flushChunk);

This doesn't win any performance or memory usage points over your original solution because it will still materialize the entire stream before doing anything.

If you want to avoid materializing the list, the Stream API will not help you. You will have to get the stream's iterator or spliterator and do something like this:

Spliterator<Integer> split = stream.spliterator();
int chunkSize = 1000;

while (true) {
    List<Integer> chunk = new ArrayList<>(chunkSize);
    // pull up to chunkSize elements from the spliterator into the current chunk
    for (int i = 0; i < chunkSize && split.tryAdvance(chunk::add); i++) {}
    if (chunk.isEmpty()) break;
    database.flushChunk(chunk);
}
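
If this pattern is needed in more than one place, the same loop can be wrapped in a small helper. This is only a sketch; forEachChunk is a made-up name, not part of the JDK (it needs java.util.ArrayList, java.util.List, java.util.Spliterator, java.util.function.Consumer and java.util.stream.Stream):

static <T> void forEachChunk(Stream<T> stream, int chunkSize, Consumer<List<T>> action) {
    Spliterator<T> split = stream.spliterator();
    while (true) {
        List<T> chunk = new ArrayList<>(chunkSize);
        for (int i = 0; i < chunkSize && split.tryAdvance(chunk::add); i++) {}
        if (chunk.isEmpty()) {
            return;
        }
        action.accept(chunk);
    }
}

// usage, equivalent to the loop above:
// forEachChunk(stream, 1000, database::flushChunk);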
