Generating one MD5/SHA1 checksum of multiple files in Python
I have looked through several topics about calculating checksums of files in Python, but none of them answered the question of producing one sum from multiple files. I have several files in subdirectories and would like to determine whether one or more of them has changed. Is there a way to generate a single checksum from multiple files?
EDIT: This is the way I do it to get a list of per-file sums (with `hashlib` imported):

checksums = [(fname, hashlib.md5(open(fname, 'rb').read()).digest()) for fname in flist]
So I made it :) This way a single hash sum is generated for a file list:

import hashlib

# Seed the hash with the first file, then fold the rest in.
hash_obj = hashlib.md5(open(flist[0], 'rb').read())
for fname in flist[1:]:
    hash_obj.update(open(fname, 'rb').read())
checksum = hash_obj.digest()
Thank you PM 2Ring for your input!
Note that MD5 is cryptographically broken, so use it only for non-security-critical purposes such as change detection, not for protecting against deliberate tampering.