Python multiprocessing, ValueError: I/O operation on closed file


Problem description

I'm having a problem with the Python multiprocessing package. Below is a simple code example that illustrates my problem.

import multiprocessing as mp
import time

def test_file(f):
  f.write("Testing...\n")
  print f.name
  return None

if __name__ == "__main__":
  f = open("test.txt", 'w')
  proc = mp.Process(target=test_file, args=[f])
  proc.start()
  proc.join()

When I run this, I get the following error.

Process Process-1:
Traceback (most recent call last):
  File "C:\Python27\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "C:\Python27\lib\multiprocessing\process.py", line 114, in run
    self.target(*self._args, **self._kwargs)
  File "C:\Users\Ray\Google Drive\Programming\Python\tests\follow_test.py", line 24, in test_file
    f.write("Testing...\n")
ValueError: I/O operation on closed file
Press any key to continue . . .

It seems that somehow the file handle is 'lost' during the creation of the new process. Could someone please explain what's going on?

Solution

I had similar issues in the past. Not sure whether it is done within the multiprocessing module or whether open sets the close-on-exec flag by default, but I know for sure that file handles opened in the main process are closed in the multiprocessing children.

The obvious workaround is to pass the filename as a parameter to the child process's init function and open it once within each child (if using a pool), or to pass it as a parameter to the target function and open/close on each invocation. The former requires the use of a global to store the file handle (not a good thing) - unless someone can show me how to avoid that :) - and the latter can incur a performance hit (but can be used with multiprocessing.Process directly).

Example of the former:

import multiprocessing

filehandle = None

def child_init(filename):
    # Pool initializer: runs once in each worker process, so the file is
    # opened inside the child and its handle kept in a module-level global.
    global filehandle
    filehandle = open(filename, ...)
    ../..

def child_target(args):
    # Worker function: writes through the module-level filehandle.
    ../..

if __name__ == '__main__':
    # some code which defines filename (and args)
    proc = multiprocessing.Pool(processes=1, initializer=child_init, initargs=[filename])
    proc.apply(child_target, args)
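
Example of the latter (a minimal sketch, not part of the original answer): it reworks the question's code so that only the filename string is passed to the child, which opens and closes the file itself on each invocation. The name test.txt and the write mode are simply reused from the question.

import multiprocessing as mp

def test_file(filename):
    # Open the file inside the child process, write, and close it again;
    # only the filename (a plain string) crosses the process boundary.
    with open(filename, 'w') as f:
        f.write("Testing...\n")
        print(f.name)

if __name__ == "__main__":
    proc = mp.Process(target=test_file, args=["test.txt"])
    proc.start()
    proc.join()

Because only a string has to be pickled for the child, this also works on Windows; the cost is an open/close on every call, which is the performance hit mentioned above.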
