StopIteration: generator_output = next(output_generator)
Problem Description

I have the following code, which I rewrote to work on a large-scale dataset. I am using a Python generator to fit the model on data yielded batch by batch.
def subtract_mean_gen(x_source, y_source, avg_image, batch):
    batch_list_x = []
    batch_list_y = []
    for line, y in zip(x_source, y_source):
        x = line.astype('float32')
        x = x - avg_image
        batch_list_x.append(x)
        batch_list_y.append(y)
        if len(batch_list_x) == batch:
            yield (np.array(batch_list_x), np.array(batch_list_y))
            batch_list_x = []
            batch_list_y = []
model = resnet.ResnetBuilder.build_resnet_18((img_channels, img_rows, img_cols), nb_classes)
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
val = subtract_mean_gen(X_test, Y_test, avg_image_test, batch_size)
model.fit_generator(subtract_mean_gen(X_train, Y_train, avg_image_train, batch_size),
                    steps_per_epoch=X_train.shape[0]//batch_size,
                    epochs=nb_epoch,
                    validation_data=val,
                    validation_steps=X_test.shape[0]//batch_size)
I get the following error:
239/249 [===========================>..] - ETA: 60s - loss: 1.3318 - acc: 0.8330Exception in thread Thread-1:
Traceback (most recent call last):
File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
File "/usr/local/lib/python2.7/dist-packages/keras/utils/data_utils.py", line 560, in data_generator_task
generator_output = next(self._generator)
StopIteration
240/249 [===========================>..] - ETA: 54s - loss: 1.3283 - acc: 0.8337Traceback (most recent call last):
File "cifa10-copy.py", line 125, in <module>
validation_steps = X_test.shape[0]//batch_size)
File "/usr/local/lib/python2.7/dist-packages/keras/legacy/interfaces.py", line 87, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 1809, in fit_generator
generator_output = next(output_generator)
StopIteration
I looked into a similar question posted here; however, I am not able to resolve why StopIteration is raised.
Recommended Answer
Generators for Keras must be infinite:
def subtract_mean_gen(x_source, y_source, avg_image, batch):
    while True:
        batch_list_x = []
        batch_list_y = []
        for line, y in zip(x_source, y_source):
            x = line.astype('float32')
            x = x - avg_image
            batch_list_x.append(x)
            batch_list_y.append(y)
            if len(batch_list_x) == batch:
                yield (np.array(batch_list_x), np.array(batch_list_y))
                batch_list_x = []
                batch_list_y = []
The error happens because Keras tries to get a new batch, but your generator has already reached its end. (Even though you defined a correct number of steps, Keras keeps a queue that tries to fetch more batches from the generator even when you are at the last step.)
Apparently you have the default queue size, which is 10 (the exception appears 10 batches before the end because the queue is trying to fetch batches past the end).
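The difference can be demonstrated with a minimal, self-contained sketch (plain Python, no Keras needed; the function names are made up for illustration): a finite generator raises StopIteration once exhausted, which is exactly what Keras's prefetch queue runs into, while wrapping the same loop in `while True` restarts it indefinitely.

```python
def finite_batches(data, batch_size):
    # Yields each batch once, then is exhausted.
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]

def infinite_batches(data, batch_size):
    # Restarts from the beginning forever -- the pattern Keras expects.
    while True:
        for i in range(0, len(data), batch_size):
            yield data[i:i + batch_size]

data = [1, 2, 3, 4]

gen = finite_batches(data, 2)
next(gen)  # [1, 2]
next(gen)  # [3, 4]
try:
    next(gen)               # third call: the generator is exhausted
except StopIteration:
    print("StopIteration")  # this is what Keras's queue thread hits

gen = infinite_batches(data, 2)
# Cycles indefinitely: [1, 2], [3, 4], [1, 2], [3, 4], [1, 2], ...
print([next(gen) for _ in range(5)])
```

With an infinite generator, `steps_per_epoch` and `validation_steps` are what tell Keras where an epoch ends; the queue may still prefetch ahead of the last step, but that is harmless once the generator never terminates.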