Python: Multicore processing?

Question

I've been reading about Python's multiprocessing module. I still don't think I have a very good understanding of what it can do.

Let's say I have a quad-core processor and a list of 1,000,000 integers, and I want the sum of all of them. I could simply do:

list_sum = sum(my_list)

But this only sends it to one core.

Is it possible, using the multiprocessing module, to divide the array up, have each core compute the sum of its part, and return the values so the total sum can be computed?

Something like this:

core1_sum = sum(my_list[0:500000])          # goes to core 1
core2_sum = sum(my_list[500000:1000000])    # goes to core 2
all_core_sum = core1_sum + core2_sum        # core 3 does final computation

Any help would be appreciated.

Solution

Yes, it's possible to do this summation over several processes, very much like doing it with multiple threads:

from multiprocessing import Process, Queue

def do_sum(q, l):
    # Sum one chunk and push the partial result onto the shared queue.
    q.put(sum(l))

def main():
    my_list = list(range(1000000))

    q = Queue()

    # One worker process per half of the list.
    p1 = Process(target=do_sum, args=(q, my_list[:500000]))
    p2 = Process(target=do_sum, args=(q, my_list[500000:]))
    p1.start()
    p2.start()

    # Collect the two partial sums; the order they arrive in does not
    # matter, since addition is commutative.
    r1 = q.get()
    r2 = q.get()
    print(r1 + r2)

if __name__ == '__main__':
    main()

However, doing it with multiple processes is likely slower than doing it in a single process, as copying the data back and forth is more expensive than summing it right away.
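
For reference, here is a minimal sketch of the same split-and-combine idea using multiprocessing.Pool, which starts one worker process per CPU core by default and distributes the chunks for you. The chunk size of 250,000 (four chunks, matching the quad-core machine in the question) is an illustrative assumption, not part of the original answer, and the data-copying overhead mentioned above still applies.

from multiprocessing import Pool

def main():
    my_list = list(range(1000000))

    # Split the list into chunks of 250,000 elements (an illustrative size).
    chunk = 250000
    chunks = [my_list[i:i + chunk] for i in range(0, len(my_list), chunk)]

    # Pool() starts one worker per CPU core by default; each worker
    # sums one chunk, and the partial results come back as a list.
    with Pool() as pool:
        partial_sums = pool.map(sum, chunks)

    print(sum(partial_sums))    # combine the partial results

if __name__ == '__main__':
    main()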
