Java: Efficiently converting an array of longs to an array of bytes

Question

I have an array of longs I want to write to disk. The most efficient disk I/O functions take in byte arrays, for example:

FileOutputStream.write(byte[] b, int offset, int length)

...so I want to begin by converting my long[] to byte[] (8 bytes for each long). I'm struggling to find a clean way to do this.

Direct typecasting doesn't seem allowed:

ConversionTest.java:6: inconvertible types
found   : long[]
required: byte[]
    byte[] byteArray = (byte[]) longArray;
                            ^

It's easy to do the conversion by iterating over the array, for example:

ByteBuffer bytes = ByteBuffer.allocate(longArray.length * (Long.SIZE/8));
for( long l: longArray )
{
    bytes.putLong( l );
}
byte[] byteArray = bytes.array();

...however that seems far less efficient than simply treating the long[] as a series of bytes.

Interestingly, when reading the file, it's easy to "cast" from byte[] to longs using Buffers:

LongBuffer longs = ByteBuffer.wrap(byteArray).asLongBuffer();
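
(For completeness, the longs can then be bulk-copied out of that view into a plain long[]; dest is just an illustrative name:)

long[] dest = new long[longs.remaining()];
longs.get(dest);   // bulk get, copies the remaining longs out of the view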

...but I can't seem to find any functionality to go the opposite direction.

I understand there are endian considerations when converting from long to byte, but I believe I've already addressed those: I'm using the Buffer framework shown above, which defaults to big endian, regardless of native byte order.
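
(If that default should ever be in doubt, the order can also be stated explicitly on the wrapping buffer; BIG_ENDIAN here merely restates ByteBuffer's default:)

LongBuffer longs = ByteBuffer.wrap(byteArray).order(ByteOrder.BIG_ENDIAN).asLongBuffer();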

Answer

Concerning efficiency: many details will, in fact, hardly make a difference. The hard disk is by far the slowest part involved here, and in the time it takes to write a single byte to the disk, you could have converted thousands or even millions of longs to bytes. A performance test here would tell you less about the implementation than about the hard disk. If in doubt, one should run dedicated benchmarks comparing the different conversion strategies and the different writing methods, respectively.

Assuming that the main goal is functionality that allows a convenient conversion and does not impose unnecessary overhead, I'd like to propose the following approach:

One can create a ByteBuffer of sufficient size, view it as a LongBuffer, use the bulk LongBuffer#put(long[]) method (which takes care of endianness conversion, if necessary, and does so as efficiently as possible), and finally write the original ByteBuffer (which is now filled with the long values) to the file using a FileChannel.

Following this idea, I think that this method is convenient and (most likely) rather efficient:

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

private static void bulkAndChannel(String fileName, long[] longArray) 
{
    // One byte buffer large enough for the whole array (8 bytes per long)
    ByteBuffer bytes = 
        ByteBuffer.allocate(longArray.length * Long.BYTES);
    // View it as a LongBuffer and bulk-copy the array into the backing bytes
    bytes.order(ByteOrder.nativeOrder()).asLongBuffer().put(longArray);
    try (FileOutputStream fos = new FileOutputStream(fileName))
    {
        // Write the backing ByteBuffer directly through the file's channel
        fos.getChannel().write(bytes);
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
}
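
A quick usage sketch (the file name and the data are made up for illustration; reading back requires an additional import of java.nio.LongBuffer, and exception handling is omitted). Note that whoever reads the file back has to apply the same byte order, here ByteOrder.nativeOrder():

long[] data = { 1L, 2L, 3L, 0xCAFEBABEL };
bulkAndChannel("longs.bin", data);

// Reading the file back with a matching byte order:
byte[] raw = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("longs.bin"));
LongBuffer longs = ByteBuffer.wrap(raw).order(ByteOrder.nativeOrder()).asLongBuffer();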

(Of course, one could argue about whether allocating one "large" buffer is the best idea. But thanks to the convenience methods of the Buffer classes, this can easily be modified to write "chunks" of data of an appropriate size, for the case where one really wants to write a huge array and the memory overhead of creating the corresponding ByteBuffer would be prohibitively large.)
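
A minimal sketch of such a chunked variant, assuming the same imports as above plus java.nio.LongBuffer and java.nio.channels.FileChannel; the chunk size and the method name writeInChunks are made up for illustration:

private static void writeInChunks(String fileName, long[] longArray)
{
    final int chunkLongs = 1 << 16; // arbitrary example size: 65536 longs (512 KiB) per chunk
    ByteBuffer bytes = ByteBuffer.allocate(chunkLongs * Long.BYTES);
    LongBuffer longs = bytes.order(ByteOrder.nativeOrder()).asLongBuffer();
    try (FileOutputStream fos = new FileOutputStream(fileName))
    {
        FileChannel channel = fos.getChannel();
        for (int offset = 0; offset < longArray.length; offset += chunkLongs)
        {
            int count = Math.min(chunkLongs, longArray.length - offset);
            longs.clear();                        // reset the long view for the next chunk
            longs.put(longArray, offset, count);  // copy the next chunk into the backing bytes
            bytes.position(0);
            bytes.limit(count * Long.BYTES);      // expose only the bytes that were just filled
            while (bytes.hasRemaining())
            {
                channel.write(bytes);
            }
        }
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
}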
