Simpler way to run a generator function without caring about items
Question
I have some use cases in which I need to run generator functions without caring about the yielded items.
I cannot make them non-generator functions because in other use cases I certainly need the yielded values.
I am currently using a trivial self-made function to exhaust the generators.
def exhaust(generator):
    for _ in generator:
        pass
I wondered whether there is a simpler way to do this that I'm missing.
Edit: here is one use case:
def create_tables(fail_silently=True):
    """Create the respective tables."""
    for model in MODELS:
        try:
            model.create_table(fail_silently=fail_silently)
        except Exception:
            yield (False, model)
        else:
            yield (True, model)
In some contexts, I care about the error and success values…
for success, table in create_tables():
    if success:
        print('Creation of table {} succeeded.'.format(table))
    else:
        print('Creation of table {} failed.'.format(table), file=stderr)
… and in some I just want to run the function "blindly":
exhaust(create_tables())
Answer
Setting up a for loop for this could be relatively expensive. Keep in mind that a for loop in Python is fundamentally the successive execution of simple assignment statements; you'll be executing n (the number of items in the generator) assignments, only to discard the assignment targets afterwards.
You can instead feed the generator to a zero-length deque; it consumes the generator at C speed and does not use up memory, unlike list and other callables that materialise iterators/generators:
from collections import deque

def exhaust(generator):
    deque(generator, maxlen=0)
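As a quick sketch of the behaviour: the deque-based exhaust runs all of the generator's side effects without retaining any items (the noisy generator below is purely illustrative):

```python
from collections import deque

def exhaust(generator):
    # maxlen=0 makes the deque discard every item as it arrives
    deque(generator, maxlen=0)

calls = []

def noisy():
    # illustrative generator with a visible side effect
    for i in range(3):
        calls.append(i)
        yield i

gen = noisy()
# nothing has run yet; generators are lazy
assert calls == []
exhaust(gen)
# all three iterations have now executed, but no items were kept
assert calls == [0, 1, 2]
```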
Taken from the consume itertools recipe.
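For reference, the full consume recipe from the itertools documentation also supports advancing an iterator a fixed number of steps rather than exhausting it completely:

```python
from collections import deque
from itertools import islice

def consume(iterator, n=None):
    """Advance the iterator n steps ahead; if n is None, consume entirely."""
    if n is None:
        # feed the entire iterator into a zero-length deque
        deque(iterator, maxlen=0)
    else:
        # advance to the empty slice starting at position n
        next(islice(iterator, n, n), None)

it = iter(range(10))
consume(it, 3)  # skips 0, 1, 2; the next item is 3
```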