Best way to get the file list of a big directory in Python?
Problem description
I have an insanely big directory. I need to get the file list via Python.

In code I need an iterator, not a list, so these don't work:

os.listdir
glob.glob (uses listdir!)
os.walk

I can't find any good library. Help! Maybe a C++ lib?
If you have a directory that is too big for libc readdir() to read quickly, you probably want to look at the kernel call getdents() (http://www.kernel.org/doc/man-pages/online/pages/man2/getdents.2.html). I ran into a similar problem and wrote a long blog post about it.
http://www.olark.com/spw/2011/08/you-can-list-a-directory-with-8-million-files-but-not-with-ls/
Basically, readdir() only reads 32K of directory entries at a time, and so if you have a lot of files in a directory, readdir() will take a very long time to complete.
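On Python 3.5+ there is also a standard-library option worth mentioning: os.scandir() returns a true iterator over directory entries rather than materializing a list the way os.listdir() does, so you can start processing the first entries of a huge directory immediately. A minimal sketch:

```python
import os

def iter_files(path):
    """Lazily yield the names of regular files in `path`.

    os.scandir() (Python 3.5+) returns an iterator and fetches
    directory entries in batches, instead of building the whole
    list in memory like os.listdir() does.
    """
    with os.scandir(path) as it:
        for entry in it:
            # entry.is_file() usually needs no extra stat() call,
            # since the file type often comes from the dirent itself.
            if entry.is_file(follow_symlinks=False):
                yield entry.name
```

Under the hood scandir() is still built on the platform's readdir(), so it reads entries in chunks just as described above, but because it streams them to you lazily it avoids the long pause of collecting millions of names before returning.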