Is it safe to yield from within a "with" block in Python (and why)?

Question
The combination of coroutines and resource acquisition seems like it could have some unintended (or unintuitive) consequences.
The basic question is whether or not something like this works:
def coroutine():
    with open(path, 'r') as fh:
        for line in fh:
            yield line
Which it does. (You can test it!)
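A runnable Python 3 sketch of the pattern above (the temporary file and its contents are assumptions added for the demonstration; the question's own code is Python 2):

```python
import os
import tempfile

# Create a small test file to read from (an assumption for this sketch).
with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as f:
    f.write('spam\neggs\n')
    path = f.name

def coroutine():
    # Yielding from inside the with block suspends execution
    # while the file handle remains open.
    with open(path, 'r') as fh:
        for line in fh:
            yield line

lines = list(coroutine())  # exhausting the generator exits the with block
print(lines)               # ['spam\n', 'eggs\n']
os.remove(path)
```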
The deeper concern is that with is supposed to be an alternative to finally, where you ensure that a resource is released at the end of the block. Coroutines can suspend and resume execution from within the with block, so how is the conflict resolved?
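The interaction can be observed directly with a minimal context manager that records whether its __exit__ ran (the Tracker class is a hypothetical helper, and the sketch uses Python 3's next()/close()):

```python
class Tracker:
    """Hypothetical context manager that records whether __exit__ ran."""
    def __init__(self):
        self.exited = False
    def __enter__(self):
        return self
    def __exit__(self, *exc_info):
        self.exited = True
        return False  # don't suppress exceptions

def gen(tracker):
    with tracker:
        yield 'spam'
        yield 'eggs'

t = Tracker()
g = gen(t)
next(g)            # suspend inside the with block
print(t.exited)    # False: the resource is still held while suspended
g.close()          # raises GeneratorExit at the suspension point
print(t.exited)    # True: the with block's __exit__ has now run
```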
For example, if you open a file with read/write both inside and outside a coroutine while the coroutine hasn't yet returned:
def coroutine():
    with open('test.txt', 'rw+') as fh:
        for line in fh:
            yield line

a = coroutine()
assert a.next()  # Open the filehandle inside the coroutine first.
with open('test.txt', 'rw+') as fh:  # Then open it outside.
    for line in fh:
        print 'Outside coroutine: %r' % repr(line)
assert a.next()  # Can we still use it?
更新
我要写上一个示例中的-locked文件句柄争用,但是由于大多数OS按进程分配文件句柄,因此那里没有争用。 (@Miles指出该示例没有多大意义。)这是我修改后的示例,它显示了一个实际的死锁条件:
Update
I was going for write-locked file handle contention in the previous example, but since most OSes allocate filehandles per-process there will be no contention there. (Kudos to @Miles for pointing out the example didn't make too much sense.) Here's my revised example, which shows a real deadlock condition:
import threading

lock = threading.Lock()

def coroutine():
    with lock:
        yield 'spam'
        yield 'eggs'

generator = coroutine()
assert generator.next()
with lock:  # Deadlock!
    print 'Outside the coroutine got the lock'
assert generator.next()
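One way out of the deadlock above (a Python 3 sketch; the original uses Python 2's generator.next()) is to close the generator before contending for the lock, which raises GeneratorExit inside it and lets its with block release the lock:

```python
import threading

lock = threading.Lock()

def coroutine():
    with lock:
        yield 'spam'
        yield 'eggs'

g = coroutine()
assert next(g) == 'spam'  # the generator now holds the lock
g.close()                 # GeneratorExit: the with block exits, releasing it
with lock:                # no deadlock: the lock is free again
    print('Outside the coroutine got the lock')
```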
Answer
I don't really understand what conflict you're asking about, nor the problem with the example: it's fine to have two coexisting, independent handles to the same file.
One thing I didn't know, which I learned in response to your question, is that there is a new close() method on generators:
close() raises a new GeneratorExit exception inside the generator to terminate the iteration. On receiving this exception, the generator's code must either raise GeneratorExit or StopIteration.
close() is called when a generator is garbage-collected, so this means the generator's code gets one last chance to run before the generator is destroyed. This last chance means that try...finally statements in generators can now be guaranteed to work; the finally clause will now always get a chance to run. This seems like a minor bit of language trivia, but using generators and try...finally is actually necessary in order to implement the with statement described by PEP 343.

http://docs.python.org/whatsnew/2.5.html#pep-342-new-generator-features
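That guarantee can be checked directly; in this Python 3 sketch, close() triggers the finally clause at the suspension point:

```python
cleaned_up = []

def gen():
    try:
        yield 'spam'
        yield 'eggs'
    finally:
        cleaned_up.append(True)  # guaranteed a chance to run

g = gen()
next(g)              # suspended inside the try block
print(cleaned_up)    # []
g.close()            # GeneratorExit is raised, so finally runs
print(cleaned_up)    # [True]
```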
So that handles the situation where a with statement is used in a generator, but it yields in the middle and never returns: the context manager's __exit__ method will be called when the generator is garbage-collected.
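In CPython this happens promptly via reference counting as soon as the last reference to the generator is dropped; other implementations may defer it to a later GC pass. A sketch (the Recorder class is a hypothetical stand-in for a real resource):

```python
class Recorder:
    """Hypothetical context manager standing in for a real resource."""
    def __init__(self):
        self.exited = False
    def __enter__(self):
        return self
    def __exit__(self, *exc_info):
        self.exited = True
        return False

def gen(recorder):
    with recorder:
        yield 'spam'

r = Recorder()
g = gen(r)
next(g)           # suspended inside the with block
del g             # CPython: refcount hits zero, close() runs, __exit__ fires
print(r.exited)   # True under CPython's reference counting
```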
Edit:
With regards to the file handle issue: I sometimes forget that there exist platforms that aren't POSIX-like. :)
As far as locks go, I think Rafał Dowgird hits the nail on the head when he says, "You just have to be aware that the generator is just like any other object that holds resources." I don't think the with statement is really that relevant here, since this function suffers from the same deadlock issues:
def coroutine():
    lock.acquire()
    yield 'spam'
    yield 'eggs'
    lock.release()

generator = coroutine()
generator.next()
lock.acquire()  # whoops!
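If the same thread genuinely needs to re-acquire while the generator is suspended, one possible mitigation (a sketch, not part of the original answer) is a reentrant lock, which the owning thread may acquire multiple times; note the try/finally so close() still releases the generator's hold:

```python
import threading

rlock = threading.RLock()

def coroutine():
    rlock.acquire()
    try:
        yield 'spam'
        yield 'eggs'
    finally:
        rlock.release()  # runs even when the generator is close()d

g = coroutine()
next(g)           # the generator holds the reentrant lock
rlock.acquire()   # no deadlock: RLock is reentrant within the owning thread
rlock.release()
g.close()         # finally runs, releasing the generator's hold
```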