java.lang.OutOfMemoryError: Direct buffer memory when invoking Files.readAllBytes

Problem description

I've got the following code, which is designed to read a directory and compress it into a tar.gz archive. When I deploy the code onto the server and test it with a batch of files, it works on the first few test batches, but after the 4th or 5th batch it starts consistently giving me java.lang.OutOfMemoryError: Direct buffer memory, even though the file batch size stays the same and the heap space looks fine. Here's the code:

public static void compressDirectory(String archiveDirectoryToCompress) throws IOException {
  Path archiveToCompress = Files.createFile(Paths.get(archiveDirectoryToCompress + ".tar.gz"));

  try (GzipCompressorOutputStream gzipCompressorOutputStream = new GzipCompressorOutputStream(
           Files.newOutputStream(archiveToCompress));
       TarArchiveOutputStream tarArchiveOutputStream = new TarArchiveOutputStream(gzipCompressorOutputStream)) {
    Path directory = Paths.get(archiveDirectoryToCompress);
    Files.walk(directory)
        .filter(path -> !Files.isDirectory(path))
        .forEach(path -> {
          String stringPath = path.toAbsolutePath().toString()
              .replace(directory.toAbsolutePath().toString(), "")
              .replace(path.getFileName().toString(), "");
          TarArchiveEntry tarEntry = new TarArchiveEntry(stringPath + "/" + path.getFileName().toString());
          try {
            byte[] bytes = Files.readAllBytes(path); // It throws the error at this point.
            tarEntry.setSize(bytes.length);
            tarArchiveOutputStream.putArchiveEntry(tarEntry);
            tarArchiveOutputStream.write(bytes);
            tarArchiveOutputStream.closeArchiveEntry();
          } catch (Exception e) {
            LOGGER.error("There was an error while compressing the files", e);
          }
        });
  }
}

Here's the exception:

Caused by: java.lang.OutOfMemoryError: Direct buffer memory
    at java.nio.Bits.reserveMemory(Bits.java:658)
    at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123)
    at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:311)
    at sun.nio.ch.Util.getTemporaryDirectBuffer(Util.java:174)
    at sun.nio.ch.IOUtil.read(IOUtil.java:195)
    at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:158)
    at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65)
    at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:109)
    at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
    at java.nio.file.Files.read(Files.java:3105)
    at java.nio.file.Files.readAllBytes(Files.java:3158)
    at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService.lambda$compressDirectory$4(GmiEodFileArchiverService.java:124)
    at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService$$Lambda$19/183444013.accept(Unknown Source)
    at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
    at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
    at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
    at java.util.Iterator.forEachRemaining(Iterator.java:116)
    at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:512)
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:502)
    at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
    at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
    at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService.compressDirectory(GmiEodFileArchiverService.java:117)
    at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService.archiveFiles(GmiEodFileArchiverService.java:66)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.expression.spel.support.ReflectiveMethodExecutor.execute(ReflectiveMethodExecutor.java:113)
    at org.springframework.expression.spel.ast.MethodReference.getValueInternal(MethodReference.java:102)
    at org.springframework.expression.spel.ast.MethodReference.access$000(MethodReference.java:49)
    at org.springframework.expression.spel.ast.MethodReference$MethodValueRef.getValue(MethodReference.java:347)
    at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:88)
    at org.springframework.expression.spel.ast.SpelNodeImpl.getTypedValue(SpelNodeImpl.java:131)
    at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:330)
    at org.springframework.integration.util.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:166)
    at org.springframework.integration.util.MessagingMethodInvokerHelper.processInternal(MessagingMethodInvokerHelper.java:317)
    ... 93 more
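
For reference, direct buffer usage does not show up in the heap numbers at all; one way to watch it is through the JVM's BufferPoolMXBean. This is only a diagnostic sketch, not part of the code above:

// Diagnostic sketch (not part of the original code): prints how much direct,
// off-heap buffer memory the JVM currently has allocated. Heap metrics do not
// include this, which is why the heap can look fine while direct memory runs out.
static void logDirectMemoryUsage() {
  for (java.lang.management.BufferPoolMXBean pool :
      java.lang.management.ManagementFactory.getPlatformMXBeans(
          java.lang.management.BufferPoolMXBean.class)) {
    System.out.printf("%s pool: used=%d bytes, capacity=%d bytes%n",
        pool.getName(), pool.getMemoryUsed(), pool.getTotalCapacity());
  }
}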

I think there's a buffer memory leak, since it works flawlessly on the first four test batches but then consistently gives a java.lang.OutOfMemoryError: Direct buffer memory afterwards, and I have no clue how to fix it. I saw a potential solution using a Cleaner method here: http://www.java67.com/2014/01/how-to-fix-javalangoufofmemoryerror-direct-byte-buffer-java.html

But I don't know if that could apply in this case.
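
For reference, the Cleaner approach that article mentions generally amounts to freeing a direct buffer's native memory eagerly instead of waiting for GC. A rough sketch is below; it relies on JDK-internal classes (sun.nio.ch.DirectBuffer, sun.misc.Cleaner) and only helps if you hold a reference to the buffer yourself, which Files.readAllBytes never exposes:

// Sketch of the Cleaner workaround: releases a direct ByteBuffer's native memory
// immediately rather than waiting for the buffer object to be garbage collected.
// Relies on JDK-internal APIs (Java 8) and assumes you actually have the buffer
// in hand, which is not the case for the temporary buffers used by Files.readAllBytes.
static void freeDirectBuffer(java.nio.ByteBuffer buffer) {
  if (buffer != null && buffer.isDirect()) {
    sun.misc.Cleaner cleaner = ((sun.nio.ch.DirectBuffer) buffer).cleaner();
    if (cleaner != null) {
      cleaner.clean();
    }
  }
}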

------------------------EDIT------------------------

I found another approach for tarring the files using IOUtils and buffered input streams, and that fixed the problem. Updated code:

public static void compressDirectory(String archiveDirectoryToCompress) throws IOException {
  Path archiveToCompress = Files.createFile(Paths.get(archiveDirectoryToCompress + ".tar.gz"));

  try (GzipCompressorOutputStream gzipCompressorOutputStream = new GzipCompressorOutputStream(
           Files.newOutputStream(archiveToCompress));
       TarArchiveOutputStream tarArchiveOutputStream = new TarArchiveOutputStream(gzipCompressorOutputStream)) {
    Path directory = Paths.get(archiveDirectoryToCompress);
    Files.walk(directory)
        .filter(path -> !Files.isDirectory(path))
        .forEach(path -> {
          TarArchiveEntry tarEntry = new TarArchiveEntry(path.toFile(), path.getFileName().toString());
          try (BufferedInputStream bufferedInputStream = new BufferedInputStream(new FileInputStream(path.toString()))) {
            tarArchiveOutputStream.putArchiveEntry(tarEntry);
            IOUtils.copy(bufferedInputStream, tarArchiveOutputStream);
            tarArchiveOutputStream.closeArchiveEntry();
          } catch (Exception e) {
            LOGGER.error("There was an error while compressing the files", e);
          }
        });
  }
}
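
For completeness, a hypothetical call (the directory path here is made up purely for illustration):

// Hypothetical usage: archives every file under /data/batches/batch-01 into
// /data/batches/batch-01.tar.gz next to it. The path is illustrative only.
compressDirectory("/data/batches/batch-01");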

Recommended answer

Actually, you can get the file size just by calling file.length(). Try changing the way you read the bytes from the file:

tarArchiveOutputStream.write(IOUtils.toByteArray(new FileInputStream(path.toFile())));

The IOUtils class comes from the Apache Commons IO package (http://commons.apache.org/proper/commons-io/). I think it should help resolve your trouble: reading through a plain FileInputStream stays on the heap, whereas Files.readAllBytes goes through an NIO channel that copies via a temporary direct buffer (the sun.nio.ch.Util.getTemporaryDirectBuffer frame in your stack trace). In some cases the suggestion of @afretas is useful as well.
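
Putting that together with the original loop body, a rough sketch of the suggested change might look like this; it reuses the stringPath and tarEntry names from your code, takes the entry size from File.length(), and closes the stream with try-with-resources, which the one-liner above skips:

// Sketch only: the entry size comes from File.length() instead of materializing
// the bytes first, and the FileInputStream is closed via try-with-resources.
TarArchiveEntry tarEntry = new TarArchiveEntry(stringPath + "/" + path.getFileName().toString());
tarEntry.setSize(path.toFile().length());
tarArchiveOutputStream.putArchiveEntry(tarEntry);
try (InputStream in = new FileInputStream(path.toFile())) {
  tarArchiveOutputStream.write(IOUtils.toByteArray(in));
}
tarArchiveOutputStream.closeArchiveEntry();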
