Find duplicated files
Problem Description
Hi everyone,

I am trying to build a project that finds all duplicated files in a directory. As a first step, I want to get all the files in the directory and its subdirectories, so I tried this code:
List<string> enumeratedFiles = Directory.EnumerateFiles(@"d:\", "*.*", SearchOption.AllDirectories)
    .Where(str => str.Contains("."))
    .AsParallel()
    .ToList();
But I get this error:

Access to the path 'd:\System Volume Information\' is denied.

I know that I can't access this directory, but is there any way to skip it and continue?

Thanks, all.
Recommended Answer
As far as I know, there is no built-in feature to handle access-denied exceptions; you will have to do the recursion manually. A good example can be found at http://stackoverflow.com/questions/172544/ignore-folders-files-when-directory-getfiles-is-denied-access:
using System;
using System.IO;

static class Program
{
    static void Main()
    {
        string path = ""; // TODO
        ApplyAllFiles(path, ProcessFile);
    }

    static void ProcessFile(string path) { /* ... */ }

    static void ApplyAllFiles(string folder, Action<string> fileAction)
    {
        foreach (string file in Directory.GetFiles(folder))
        {
            fileAction(file);
        }
        foreach (string subDir in Directory.GetDirectories(folder))
        {
            try
            {
                ApplyAllFiles(subDir, fileAction);
            }
            catch
            {
                // swallow, log, whatever
            }
        }
    }
}
You will have to refine this to match your filtering.
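Once the traversal works, the duplicate check itself could be sketched roughly as follows. This is a minimal outline, not part of the original answer: the `DuplicateFinder` class, the `Sha256Of` helper, and the group-by-size-before-hashing optimization are my own assumptions about how one might finish the project. The idea is that two files can only be duplicates if they have the same length, so hashing is only needed within same-size groups.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

public static class DuplicateFinder
{
    // Hash a file's contents; files with identical hashes are duplicates.
    // (Sha256Of is a hypothetical helper name, not from the original answer.)
    public static string Sha256Of(string path)
    {
        using (var sha = SHA256.Create())
        using (var stream = File.OpenRead(path))
        {
            return BitConverter.ToString(sha.ComputeHash(stream));
        }
    }

    // Group the collected paths: first by size (cheap), then by hash (exact).
    public static IEnumerable<List<string>> FindDuplicates(IEnumerable<string> files)
    {
        return files
            .GroupBy(f => new FileInfo(f).Length) // equal size is necessary for equality
            .Where(g => g.Count() > 1)            // a unique size cannot have duplicates
            .SelectMany(g => g.GroupBy(Sha256Of)) // confirm with a content hash
            .Where(g => g.Count() > 1)
            .Select(g => g.ToList());
    }

    public static void Main()
    {
        var files = new List<string>();
        // Reuse ApplyAllFiles from the answer above to collect paths safely, e.g.:
        // ApplyAllFiles(@"d:\", files.Add);
        foreach (var group in FindDuplicates(files))
        {
            Console.WriteLine(string.Join(", ", group));
        }
    }
}
```

Because `fileAction` in the answer is just an `Action<string>`, passing `files.Add` collects every reachable path while inaccessible directories are skipped, and the LINQ pipeline then reports each group of byte-identical files.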