Computing MD5SUM of large files in C#
Problem description
I am using the following code to compute the MD5SUM of a file:
byte[] b = System.IO.File.ReadAllBytes(file);
string sum = BitConverter.ToString(new MD5CryptoServiceProvider().ComputeHash(b));
This normally works fine, but if I encounter a large file (~1GB) - e.g. an ISO image or a DVD VOB file - I get an out-of-memory exception.
However, I can compute the MD5SUM of the same file in Cygwin in about 10 seconds.
Please suggest how I can get this to work for large files in my program.
Thanks
I suggest using the alternative method:
MD5CryptoServiceProvider.ComputeHash(Stream)
and passing in an input stream opened on your file. This method almost certainly will not read the whole file into memory in one go.
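A minimal sketch of that streaming approach (the helper name `ComputeMd5` and the hex formatting are my own choices, not from the original post):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

public static class Md5Example
{
    // Hash the file through a stream so the framework reads it in
    // small internal buffers instead of loading the whole file at once.
    public static string ComputeMd5(string path)
    {
        using (var md5 = new MD5CryptoServiceProvider())
        using (var stream = File.OpenRead(path))
        {
            byte[] hash = md5.ComputeHash(stream);
            // Format as a lowercase hex string without dashes.
            return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
        }
    }
}
```

Because only a small buffer is held in memory at any time, this works for ISO-sized files without the out-of-memory exception.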
I would also note that in most implementations of MD5 it is possible to add byte[] data into the digest function a chunk at a time, and then ask for the hash at the end.
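In .NET that chunk-at-a-time pattern is exposed through TransformBlock and TransformFinalBlock on HashAlgorithm. A sketch, assuming you want explicit control over the read buffer size (the 80 KB buffer and the helper name are my own assumptions):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

public static class ChunkedMd5
{
    // Feed the file to the digest one buffer at a time, then finalize.
    public static string ComputeMd5Chunked(string path)
    {
        using (var md5 = new MD5CryptoServiceProvider())
        using (var stream = File.OpenRead(path))
        {
            byte[] buffer = new byte[81920]; // ~80 KB per read; tune as needed
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Passing null for the output buffer discards the copied bytes.
                md5.TransformBlock(buffer, 0, read, null, 0);
            }
            md5.TransformFinalBlock(buffer, 0, 0); // finalize with an empty block
            return BitConverter.ToString(md5.Hash).Replace("-", "").ToLowerInvariant();
        }
    }
}
```

This gives the same result as ComputeHash(Stream), but lets you report progress or cancel between chunks, which can matter for multi-gigabyte files.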