Can yield produce multiple consecutive generators?
Problem Description
Here are two functions that split an iterable into sub-lists. I believe this type of task has been programmed many times. I use them to parse log files that consist of repr lines like ('result', 'case', 123, 4.56) and ('dump', ..) and so on.
I would like to change these so that they yield iterators rather than lists, because a list may grow pretty large, but I may be able to decide whether to take it or skip it based on the first few items. Also, if iterator versions were available I would like to nest them, whereas these list versions waste some memory by duplicating parts.
But deriving multiple generators from one iterable source wasn't easy for me, so I ask for help. If possible, I wish to avoid introducing new classes.
Also, if you know a better title for this question, please tell me.
Thanks!
def cleave_by_mark (stream, key_fn, end_with_mark=False):
    '''[f f t][t][f f] (true) [f f][t][t f f] (false)'''
    buf = []
    for item in stream:
        if key_fn(item):
            if end_with_mark: buf.append(item)
            if buf: yield buf
            buf = []
            if end_with_mark: continue
        buf.append(item)
    if buf: yield buf
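As a quick sanity check, here is a Python 3 restatement of the list-based function above, together with calls confirming that the two splitting modes match the docstring (the restatement is mine, not from the original post):

```python
# Restatement of cleave_by_mark from above (Python 3 print syntax).
def cleave_by_mark(stream, key_fn, end_with_mark=False):
    buf = []
    for item in stream:
        if key_fn(item):                 # a "mark" item closes the current sub-list
            if end_with_mark:
                buf.append(item)         # mark ends the block it belongs to
            if buf:
                yield buf
            buf = []
            if end_with_mark:
                continue
        buf.append(item)                 # a mark (default mode) or plain item extends the block
    if buf:
        yield buf

print(list(cleave_by_mark([1, 0, 0, 1, 1, 0], lambda x: x)))
# -> [[1, 0, 0], [1], [1, 0]]
print(list(cleave_by_mark([1, 0, 0, 1, 1, 0], lambda x: x, end_with_mark=True)))
# -> [[1], [0, 0, 1], [1], [0]]
```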
def cleave_by_change (stream, key_fn):
    '''[1 1 1][2 2][3][2 2 2 2]'''
    prev = None
    buf = []
    for item in stream:
        iden = key_fn(item)
        if prev is None: prev = iden
        if prev != iden:
            yield buf
            buf = []
            prev = iden
        buf.append(item)
    if buf: yield buf
Edit: my own answer
Thanks to all the answers, I was able to write what I asked for! Of course, for the cleave_by_change function I could also have used itertools.groupby.
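For reference, a groupby-based sketch of the lazy cleave_by_change (my own restatement, not code from the post): itertools.groupby already yields one lazy group per run of equal keys, and, like any of these lazy versions, each group must be consumed in order.

```python
import itertools

def cleave_by_change(stream, key_fn):
    # groupby yields (key, group) pairs; each group is a lazy iterator
    # over one run of items whose key_fn values are equal.
    for _key, group in itertools.groupby(stream, key=key_fn):
        yield group

blocks = [list(g) for g in cleave_by_change(iter((1, 1, 1, 2, 2, 2, 3, 2)), lambda x: x)]
print(blocks)  # -> [[1, 1, 1], [2, 2, 2], [3], [2]]
```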
def cleave_by_mark (stream, key_fn, end_with_mark=False):
    hand = []
    def gen ():
        key = key_fn(hand[0])
        yield hand.pop(0)
        while 1:
            if end_with_mark and key: break
            hand.append(stream.next())
            key = key_fn(hand[0])
            if (not end_with_mark) and key: break
            yield hand.pop(0)
    while 1:
        # allow StopIteration in the main loop
        if not hand: hand.append(stream.next())
        yield gen()
for cl in cleave_by_mark (iter((1,0,0,1,1,0)), lambda x:x):
    print list(cl), # start with 1
# -> [1, 0, 0] [1] [1, 0]

for cl in cleave_by_mark (iter((0,1,0,0,1,1,0)), lambda x:x):
    print list(cl),
# -> [0] [1, 0, 0] [1] [1, 0]

for cl in cleave_by_mark (iter((1,0,0,1,1,0)), lambda x:x, True):
    print list(cl), # end with 1
# -> [1] [0, 0, 1] [1] [0]

for cl in cleave_by_mark (iter((0,1,0,0,1,1,0)), lambda x:x, True):
    print list(cl),
# -> [0, 1] [0, 0, 1] [1] [0]
def cleave_by_change (stream, key_fn):
    '''[1 1 1][2 2][3][2 2 2 2]'''
    hand = []
    def gen ():
        headkey = key_fn(hand[0])
        yield hand.pop(0)
        while 1:
            hand.append(stream.next())
            key = key_fn(hand[0])
            if key != headkey: break
            yield hand.pop(0)
    while 1:
        # allow StopIteration in the main loop
        if not hand: hand.append(stream.next())
        yield gen()
for cl in cleave_by_change (iter((1,1,1,2,2,2,3,2)), lambda x:x):
    print list(cl),
# -> [1, 1, 1] [2, 2, 2] [3] [2]
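The code above is Python 2 (stream.next(), print as a statement). A possible Python 3 port of the lazy cleave_by_change is sketched below; it is my own adaptation, since under PEP 479 a StopIteration leaking out of a generator becomes a RuntimeError, so the "allow StopIteration in the main loop" trick must be replaced with explicit returns:

```python
def cleave_by_change(stream, key_fn):
    """Python 3 port: next() instead of .next(), explicit returns instead
    of letting StopIteration leak (PEP 479 forbids that)."""
    stream = iter(stream)
    hand = []  # one-item lookahead buffer shared with the inner generators

    def gen():
        headkey = key_fn(hand[0])
        yield hand.pop(0)
        while True:
            try:
                hand.append(next(stream))
            except StopIteration:
                return                    # source exhausted: end this block
            if key_fn(hand[0]) != headkey:
                return                    # key changed: item stays in hand for the next block
            yield hand.pop(0)

    while True:
        if not hand:
            try:
                hand.append(next(stream))
            except StopIteration:
                return                    # no more blocks
        yield gen()

blocks = [list(b) for b in cleave_by_change(iter((1, 1, 1, 2, 2, 2, 3, 2)), lambda x: x)]
print(blocks)  # -> [[1, 1, 1], [2, 2, 2], [3], [2]]
```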
CAUTION: If anyone's going to use these, be sure to exhaust the generators at every level, as Andrew pointed out, because otherwise the outer generator-yielding loop will restart right where the inner generator left off instead of where the next "block" begins.
stream = itertools.product('abc','1234', 'ABCD')
for a in iters.cleave_by_change(stream, lambda x:x[0]):
    for b in iters.cleave_by_change(a, lambda x:x[1]):
        print b.next()
        for sink in b: pass
        for sink in a: pass

('a', '1', 'A')
('b', '1', 'A')
('c', '1', 'A')
Recommended Answer
Adam's answer is good. This is just in case you're curious how to do it by hand:
def cleave_by_change(stream):
    def generator():
        head = stream[0]
        while stream and stream[0] == head:
            yield stream.pop(0)
    while stream:
        yield generator()

for g in cleave_by_change([1,1,1,2,2,3,2,2,2,2]):
    print list(g)
which gives:
[1, 1, 1]
[2, 2]
[3]
[2, 2, 2, 2]
(A previous version required a hack or, in Python 3, nonlocal, because I assigned to stream inside generator(), which made (a second variable also called) stream local to generator() by default - credit to gnibbler in the comments.)
Note that this approach is dangerous - if you don't "consume" the generators that are returned then you will get more and more of them, because stream is not getting any smaller.
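To make the danger concrete, here is a small demonstration of my own using the list-popping version above: if you grab a second generator without consuming the first, both end up reading from the front of the same list, so they cover the same block.

```python
def cleave_by_change(stream):
    def generator():
        head = stream[0]
        while stream and stream[0] == head:
            yield stream.pop(0)
    while stream:
        yield generator()

data = [1, 1, 2]
gens = cleave_by_change(data)
g1 = next(gens)   # nothing consumed from g1 yet, so data is untouched
g2 = next(gens)   # data still starts with 1, so g2 covers the SAME block as g1
print(next(g1), next(g2))  # -> 1 1  (both pop 1s from the front of data)
print(data)                # -> [2]
```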