Bash while loop + cut slow
Problem description
I am trying to process a file (1.5GB) with a bash loop to iterate over each line. I used cut
for its (relative) simplicity and ended up with:
while read line
do
echo "$(echo $line | cut -d' ' -f 2-3)" "$(echo $line | cut -d'"' -f 20)"
done < TEST.log > IDS.log
This is very slow and only does about 2KB/sec. I need something to run a lot faster.
Also, what is the bottleneck here?
Recommended answer
The bottleneck is likely that you spawn several processes for every line of data: each iteration forks two command substitutions, two echo pipelines, and two cut invocations. As a replacement, this awk should be equivalent:
awk '{ split($0, a, "\""); print $2, $3, a[20] }' TEST.log > IDS.log
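As a minimal sketch of what the awk one-liner does, here it is run against a made-up log line (the sample fields "10:15:32", "GET", and "user-42" are invented for illustration; real log layouts will differ):

```shell
# Build a sample line: "10:15:32" and "GET" are the 2nd/3rd
# space-separated fields, and "user-42" sits after the 19th
# double quote, i.e. it is the 20th quote-delimited field.
quotes=$(printf '"%.0s' $(seq 1 19))
printf 'ip 10:15:32 GET %suser-42\n' "$quotes" > sample.log

# Single awk process handles both extractions per line:
# $2/$3 come from the default space splitting, a[20] from
# splitting the whole line on double quotes.
awk '{ split($0, a, "\""); print $2, $3, a[20] }' sample.log
# → 10:15:32 GET user-42
```

Because awk reads the file in one process instead of forking several processes per line, throughput is limited by I/O rather than by fork/exec overhead.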