Simple Python Multiprocessing function in Spyder doesn't output results
Question
I have this very simple function that I'm trying to run and test, but it doesn't output anything and it doesn't raise any errors either. I've checked the code multiple times and can't find a mistake.
I printed jobs and here's what I got:
[<Process(Process-12, stopped[1])>,
<Process(Process-13, stopped[1])>,
<Process(Process-14, stopped[1])>,
<Process(Process-15, stopped[1])>,
<Process(Process-16, stopped[1])>]
Here's the code:
import multiprocessing

def worker(num):
    print "worker ", num
    return

jobs = []
for i in range(5):
    p = multiprocessing.Process(target=worker, args=(i,))
    jobs.append(p)
    p.start()
Here's the result I'm expecting, but nothing is printed:
Worker: 0
Worker: 1
Worker: 2
Worker: 3
Worker: 4
Answer
The comments revealed that the OP uses Windows as well as Spyder. Since Spyder redirects stdout and Windows does not support forking, a new child process won't print into the Spyder console. This is simply due to the fact that the stdout of the new child process is Python's vanilla stdout, which can also be found in sys.__stdout__.
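The distinction between the two streams can be checked directly. This is a minimal sketch, not part of the original answer: in a plain interpreter both names refer to the same object, while an environment that redirects output (such as Spyder's console) replaces sys.stdout only.

```python
import sys

# sys.stdout is whatever stream the current environment installed (an IDE
# like Spyder may swap it out); sys.__stdout__ always keeps the
# interpreter's original stream.
if sys.stdout is sys.__stdout__:
    print("stdout has not been redirected")
else:
    # Writing to the original stream bypasses the IDE's console entirely
    sys.__stdout__.write("stdout is redirected; this bypasses the redirection\n")
```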
There are two options:

- Use the logging module. This would encompass creating and logging all messages to one or several files. With a single log file, the output may come out slightly garbled because the processes write to it concurrently; using one file per process solves this.
- Don't use print within the child processes; instead, simply return the result to the main process, either via a queue (or multiprocessing.Manager().Queue(), since forking is not possible) or, more simply, by relying on the multiprocessing Pool's map functionality; see the example below.
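The queue-based variant of the second option could look roughly like this (a sketch; the worker function and its result string mirror the Pool example below). Note that the queue is drained before joining the children, since a child cannot exit cleanly while its queued data is still unconsumed:

```python
import multiprocessing

def worker(num, queue):
    # Instead of printing, hand the result back to the main process
    queue.put("worker %d" % num)

def main():
    queue = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(i, queue))
             for i in range(5)]
    for p in procs:
        p.start()
    # get() blocks until a result arrives; arrival order depends on scheduling
    results = [queue.get() for _ in procs]
    for p in procs:
        p.join()
    for result in sorted(results):
        print(result)

if __name__ == '__main__':
    main()
```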
Multiprocessing example with a Pool:
import multiprocessing

def worker(num):
    """Returns the string of interest"""
    return "worker %d" % num

def main():
    pool = multiprocessing.Pool(4)
    results = pool.map(worker, range(10))
    pool.close()
    pool.join()

    for result in results:
        # prints the result string in the main process
        print(result)

if __name__ == '__main__':
    # Better protect your main function when you use multiprocessing
    main()
This prints (in the main process):
worker 0
worker 1
worker 2
worker 3
worker 4
worker 5
worker 6
worker 7
worker 8
worker 9
If you are too impatient to wait for the map function to finish, you can print your results immediately by using imap_unordered and slightly changing the order of the commands:
def main():
    pool = multiprocessing.Pool(4)
    results = pool.imap_unordered(worker, range(10))

    for result in results:
        # prints the result strings as soon as they are ready,
        # but the results are no longer in order!
        print(result)

    # The pool should be joined after printing all results
    pool.close()
    pool.join()