How to run given function in Bash in parallel?
Problem description
There have been some similar questions, but my problem is not "run several programs in parallel" - that can be done trivially with parallel or xargs.
I need to parallelize Bash functions.
Let's imagine code like this:
for i in "${list[@]}"
do
    for j in "${other[@]}"
    do
        # some processing in here - 20-30 lines of almost pure bash
    done
done
Some of the processing requires calls to external programs.
I'd like to run some (4-10) tasks, each running for a different $i. The total number of elements in $list is > 500.
I know I can put the whole for j ... done loop in an external script and just call this program in parallel, but is it possible to do it without splitting the functionality between two separate programs?
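For reference, the pattern the question describes can also be throttled with plain Bash job control, keeping everything in one script. This is only a minimal sketch: it assumes bash 4.3+ (for wait -n), and the arrays list and other hold hypothetical sample values standing in for the real >500-element data.

```shell
#!/bin/bash
# Hypothetical sample data; the real $list has > 500 elements.
list=(1 2 3 4 5)
other=(a b)

dowork() {
    # Placeholder for the 20-30 lines of processing.
    echo "Processing i=$1, j=$2"
}

max_jobs=4
for i in "${list[@]}"; do
    (
        # The whole inner loop runs in a background subshell per $i.
        for j in "${other[@]}"; do
            dowork "$i" "$j"
        done
    ) &
    # Throttle: while max_jobs subshells are still running,
    # wait for any one of them to finish (bash 4.3+).
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
        wait -n
    done
done
wait   # wait for the remaining background jobs
```

Output lines may interleave across jobs, since each subshell writes to stdout independently.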
Edit: Please consider Ole's answer instead.
Instead of a separate script, you can put your code in a separate bash function. You can then export it, and run it via xargs:
#!/bin/bash
dowork() {
    sleep $((RANDOM % 10 + 1))
    echo "Processing i=$1, j=$2"
}
export -f dowork

for i in "${list[@]}"
do
    for j in "${other[@]}"
    do
        printf "%s\0%s\0" "$i" "$j"
    done
done | xargs -0 -n 2 -P 4 bash -c 'dowork "$@"' --
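The Ole mentioned in the edit above is the author of GNU parallel, which can run exported Bash functions directly and builds the cross product of its input sources for you. A minimal sketch, assuming GNU parallel is installed; list and other hold hypothetical sample values:

```shell
#!/bin/bash
# Hypothetical sample data.
list=(1 2 3)
other=(a b)

dowork() {
    echo "Processing i=$1, j=$2"
}
export -f dowork

# Each ::: supplies one argument source; GNU parallel runs dowork
# over the cross product of list x other, 4 jobs at a time.
parallel -j 4 dowork ::: "${list[@]}" ::: "${other[@]}"
```

Unlike the xargs version, no NUL-delimited pair encoding is needed: parallel pairs the arguments itself.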