How to iterate over a folder with a large number of files in PowerShell?
Problem description
I'm trying to write a script that iterates over 1.6 million files in a folder and moves each one to the correct folder based on its file name.
The reason is that NTFS can't handle a large number of files in a single folder without performance degrading.
The script calls Get-ChildItem to get all the items in that folder, and as you might expect, this consumes a lot of memory (about 3.8 GB).
I'm curious whether there is another way to iterate through all the files in a directory without using so much memory.
Recommended answer
If you do this:
$files = Get-ChildItem $dirWithMillionsOfFiles
#Now, process with $files
you will run into memory problems, because the entire directory listing is collected into $files before any processing starts.
Use the PowerShell pipeline to process the files instead:
Get-ChildItem $dirWithMillionsOfFiles | ForEach-Object {
    # process each file here; $_ is the current item
}
The second approach consumes far less memory, because each file is handed down the pipeline as it is enumerated rather than collected into a variable first, so memory usage should ideally not grow beyond a certain point.
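Applied to the original problem of sorting 1.6 million files into subfolders, the pipeline pattern might look like the sketch below. The bucketing rule (taking the first two characters of the file name as the subfolder name) is a hypothetical example; substitute whatever your file names actually encode.

```powershell
# Sketch: stream files through the pipeline and move each into a bucket folder.
# ASSUMPTION: the first two characters of the name choose the bucket -
# replace this rule with your own naming logic.
$dirWithMillionsOfFiles = 'C:\data\inbox'

Get-ChildItem $dirWithMillionsOfFiles -File | ForEach-Object {
    $bucket = $_.Name.Substring(0, 2)
    $target = Join-Path $dirWithMillionsOfFiles $bucket

    # Create the bucket folder on first use.
    if (-not (Test-Path $target)) {
        New-Item -ItemType Directory -Path $target | Out-Null
    }

    Move-Item -LiteralPath $_.FullName -Destination $target
}
```

Note that the `-File` switch requires PowerShell 3.0 or later; on older versions you can filter with `Where-Object { -not $_.PSIsContainer }` instead.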
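If Get-ChildItem is still too heavy (older PowerShell versions buffer the whole directory listing internally before emitting anything), you can drop down to the .NET enumerator, which yields one path at a time. This requires .NET 4 / PowerShell 3.0 or later:

```powershell
# [System.IO.Directory]::EnumerateFiles returns a lazy IEnumerable<string>,
# so entries are streamed one at a time instead of being collected up front.
foreach ($path in [System.IO.Directory]::EnumerateFiles($dirWithMillionsOfFiles)) {
    # $path is the full path of one file; process it here.
}
```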