Efficiently merge file chunks
Question
Hi, I'm trying to merge a split file back together. I have this code working, but it is painfully slow. Any ideas how I can speed it up? Each file chunk can be up to 5 MB, and there can be 100 or more chunks to put back together. I'm passing in an array of the chunk file paths, "mergeFileNameArray", and also the "filename" for the new merged file.
Many thanks!
Dim STo As System.IO.Stream = System.IO.File.Open _
    (filename, _
     System.IO.FileMode.Create, _
     System.IO.FileAccess.Write)
Dim BW As New System.IO.BinaryWriter(STo)
For Each F In mergeFileNameArray
    Dim STi As System.IO.Stream = System.IO.File.Open _
        (F, _
         System.IO.FileMode.Open, _
         System.IO.FileAccess.Read)
    Dim BR As New System.IO.BinaryReader(STi)
    Do Until (BR.BaseStream.Position = BR.BaseStream.Length)
        BW.Write(BR.ReadByte)
    Loop
    BR.Close()
Next
BW.Close()
Copying files one byte at a time is far less efficient than it could be. Instead, you can copy chunks of the files using a buffer and FileStreams; at its most basic:
Option Infer On
Imports System.IO
Module Module1
Sub ConcatenateFileParts(fileParts As String(), destFile As String)
Using outStream As New FileStream(destFile, FileMode.Create, FileAccess.Write, FileShare.None)
Dim bufferLength = 32768
Dim buffer(bufferLength - 1) As Byte
Dim bytesRead As Integer
For Each inFile In fileParts
Using inStream As New FileStream(inFile, FileMode.Open, FileAccess.Read, FileShare.Read)
bytesRead = inStream.Read(buffer, 0, bufferLength)
While bytesRead > 0
outStream.Write(buffer, 0, bytesRead)
bytesRead = inStream.Read(buffer, 0, bufferLength)
End While
End Using
Next
End Using
End Sub
Sub Main()
' I put a couple of files in "C:\temp\subdir" for testing.
Dim filesToMerge = Directory.GetFiles("C:\temp\subdir")
Dim dest = "C:\temp\merged.txt"
Try
ConcatenateFileParts(filesToMerge, dest)
Catch ex As Exception
MsgBox("File merge failed because " & ex.Message)
End Try
End Sub
End Module
The Using construct makes sure that the FileStreams are closed and disposed of properly.
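On .NET 4 and later, the manual buffer loop can be replaced with Stream.CopyTo, which performs the buffered read/write internally. A minimal sketch of that variant (ConcatenateFilePartsCopyTo is just an illustrative name):

Imports System.IO

Module CopyToExample
    Sub ConcatenateFilePartsCopyTo(fileParts As String(), destFile As String)
        Using outStream As New FileStream(destFile, FileMode.Create, FileAccess.Write, FileShare.None)
            For Each inFile In fileParts
                Using inStream As New FileStream(inFile, FileMode.Open, FileAccess.Read, FileShare.Read)
                    ' CopyTo reads inStream in buffered chunks and writes them
                    ' to outStream until the end of the input is reached.
                    inStream.CopyTo(outStream)
                End Using
            Next
        End Using
    End Sub
End Module

An overload, CopyTo(destination, bufferSize), lets you pick the buffer size explicitly if the default doesn't suit your chunk sizes.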
You could make it more sophisticated by reporting progress, for example incrementing a counter after each file part is copied.
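One way to sketch that counter idea: pass in a callback that receives the running count after each part. The progress parameter and its signature here are illustrative assumptions, not part of the original code:

Imports System.IO

Module ProgressExample
    Sub ConcatenateFilePartsWithProgress(fileParts As String(), destFile As String,
                                         progress As Action(Of Integer, Integer))
        Dim partsCopied = 0
        Dim bufferLength = 32768
        Dim buffer(bufferLength - 1) As Byte
        Dim bytesRead As Integer
        Using outStream As New FileStream(destFile, FileMode.Create, FileAccess.Write, FileShare.None)
            For Each inFile In fileParts
                Using inStream As New FileStream(inFile, FileMode.Open, FileAccess.Read, FileShare.Read)
                    bytesRead = inStream.Read(buffer, 0, bufferLength)
                    While bytesRead > 0
                        outStream.Write(buffer, 0, bytesRead)
                        bytesRead = inStream.Read(buffer, 0, bufferLength)
                    End While
                End Using
                ' Report (parts copied so far, total parts) after each file part.
                partsCopied += 1
                progress(partsCopied, fileParts.Length)
            Next
        End Using
    End Sub
End Module

Calling code could then, for example, write the progress to the console:

ConcatenateFilePartsWithProgress(filesToMerge, dest,
    Sub(done, total) Console.WriteLine("Copied part " & done & " of " & total))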