Apache Camel ZipInputStream closed with parallel processing
Question
I am successfully using ZipSplitter() to process files inside a zip file. I would like to use parallel processing if possible, but calling parallelProcessing() results in the stream being closed prematurely. This results in an IOException when the stream is being cached by DefaultStreamCachingStrategy.
I note that when parallel processing is enabled, ZipIterator#checkNullAnswer(Message) is called, which closes the ZipInputStream. Curiously, everything is dandy if I loiter on this method in my debugger, which suggests that the iterator is being closed before processing has completed. Is this a bug or have I messed up something?
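The underlying symptom can be reproduced with plain java.util.zip, independent of Camel: once a ZipInputStream is closed, any consumer still holding it gets an IOException on the next read. This is a minimal sketch of that failure mode (the class name and helper method are illustrative, not from Camel):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ClosedZipStreamDemo {

    // Simulates the race described above: the stream is closed while a
    // consumer still expects to read entry data from it. Returns true if
    // the late read throws IOException.
    static boolean readAfterCloseThrows() {
        try {
            // Build a tiny two-entry zip in memory.
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            try (ZipOutputStream zos = new ZipOutputStream(buf)) {
                zos.putNextEntry(new ZipEntry("a.txt"));
                zos.write("first".getBytes());
                zos.closeEntry();
                zos.putNextEntry(new ZipEntry("b.txt"));
                zos.write("second".getBytes());
                zos.closeEntry();
            }

            ZipInputStream zis =
                new ZipInputStream(new ByteArrayInputStream(buf.toByteArray()));
            zis.getNextEntry(); // position on the first entry
            zis.close();        // iterator closes the stream prematurely
            zis.read();         // consumer reads too late
            return false;       // unreachable if the stream enforces closure
        } catch (IOException e) {
            return true;        // "Stream closed"
        }
    }

    public static void main(String[] args) {
        System.out.println("read after close threw: " + readAfterCloseThrows());
    }
}
```

In the Camel route, the caching strategy plays the part of the late reader: it copies the entry's stream after the split iterator has already moved on.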
A simplified version of my route which exhibits this behaviour is:
from("file:myDirectory").
split(new ZipSplitter()).streaming().parallelProcessing().
log("Validating filename ${file:name}").
end();
This is using Camel 2.13.1.
Answer
Can you try applying CAMEL-7415 to the Camel 2.13.1 branch?
I'm not quite sure whether it will fix your issue, but it is worth giving it a shot.