Python multiprocessing stdin input


Problem description


All code was written and tested on Python 3.4, Windows 7.

I was designing a console app and needed to use stdin from the command line (Windows) to issue commands and change the program's operating mode. The program depends on multiprocessing to spread CPU-bound work across multiple processors.

I am using stdout to monitor the status and some basic return information, and stdin to issue commands that load different sub-processes based on the returned console information.

This is where I found a problem. I could not get the multiprocessing module to accept stdin input, but stdout was working just fine. I found some help on Stack Overflow and tested it: with the threading module this all works, except that all output to stdout is paused until stdin is cycled each time, due to the GIL and the blocking stdin read.

I will say I have been successful with a workaround implemented with msvcrt.kbhit(). However, I can't help but wonder if there is some sort of bug in multiprocessing that keeps stdin from reading any data. I tried numerous approaches and nothing worked when using multiprocessing. I even attempted to use Queues, but I did not try Pools or any other multiprocessing features.
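For reference, here is a minimal sketch of the kind of msvcrt.kbhit() polling loop described above. This is an illustration of that style of workaround, not the asker's actual code, and msvcrt is Windows-only:

import sys
import time
import msvcrt  # Windows-only console helpers

def worker():
    while True:
        print("working...")
        # kbhit() only reports whether a keypress is waiting, so this check
        # never blocks the loop the way sys.stdin.read(1) would.
        if msvcrt.kbhit():
            c = msvcrt.getwch()  # read one character without waiting for Enter
            sys.stdout.write("got command: " + c + "\n")
        time.sleep(0.5)

if __name__ == "__main__":
    worker()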

I also did not try this on my Linux machine, since I was focused on getting it to work here.

Here is simplified test code that does not function as intended (reminder: this was written for Python 3.4 on Windows 7):

import sys
import time
from multiprocessing import Process

def function1():
    while True:
        print("Function 1")
        time.sleep(1.33)

def function2():
    while True:
        print("Function 2")
        c = sys.stdin.read(1)  # does not appear to wait for a read before continuing the loop
        sys.stdout.write(c)  # nothing in 'c'
        sys.stdout.write(".")  # checking to see if it works at all
        print(str(c))  # trying something else, still nothing in 'c'
        time.sleep(1.66)

if __name__ == "__main__":
    p1 = Process(target=function1)
    p2 = Process(target=function2)
    p1.start()
    p2.start()

Hopefully someone can shed light on whether this is intended functionality, if I didn't implement it correctly, or some other useful bit of information.

Thanks.

Solution

When you take a look at Python's implementation of multiprocessing.Process._bootstrap(), you will see this:

if sys.stdin is not None:
    try:
        sys.stdin.close()
        sys.stdin = open(os.devnull)
    except (OSError, ValueError):
        pass

You can also confirm this by using:

>>> import sys
>>> import multiprocessing
>>> def func():
...     print(sys.stdin)
... 
>>> p = multiprocessing.Process(target=func)
>>> p.start()
>>> <_io.TextIOWrapper name='/dev/null' mode='r' encoding='UTF-8'>

And reading from os.devnull immediately returns an empty result, which is why the read(1) in the question's code never blocks and 'c' stays empty:

>>> import os
>>> f = open(os.devnull)
>>> f.read(1)
''

You can work around this by using open(0). From the documentation of open():

file is either a string or bytes object giving the pathname (absolute or relative to the current working directory) of the file to be opened or an integer file descriptor of the file to be wrapped. (If a file descriptor is given, it is closed when the returned I/O object is closed, unless closefd is set to False.)

And "0 file descriptor":

File descriptors are small integers corresponding to a file that has been opened by the current process. For example, standard input is usually file descriptor 0, standard output is 1, and standard error is 2:

>>> def func():
...     sys.stdin = open(0)
...     print(sys.stdin)
...     c = sys.stdin.read(1)
...     print('Got', c)
... 
>>> multiprocessing.Process(target=func).start()
>>> <_io.TextIOWrapper name=0 mode='r' encoding='UTF-8'>
Got a
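Applied to the question's code, the fix amounts to re-opening file descriptor 0 inside the child before entering the read loop. Here is a minimal sketch of that adaptation (my own, not part of the original answer; it assumes the child process still has a usable console stdin on fd 0, which should normally hold for a console app but is worth verifying on Windows):

import sys
import time
from multiprocessing import Process

def function2():
    sys.stdin = open(0)        # re-open fd 0; _bootstrap() replaced sys.stdin with devnull
    while True:
        print("Function 2")
        c = sys.stdin.read(1)  # now blocks until a character is actually available
        print("got:", repr(c))
        time.sleep(1.66)

if __name__ == "__main__":
    Process(target=function2).start()

Note that only one process should read stdin this way; several children competing for the same console input would receive characters unpredictably.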
