What if the sample size is not divisible by batch_size in Keras model training?


Question

What if we specify a batch size of 15 and a sample size of 1000, which is not divisible by 15, in Keras model training? Should it still be able to train?

I have also looked at this answer, but it did not help with my question.

Can anybody explain this? Thank you.

Answer

I found the answer to this. In this case, Keras simply takes the remaining 10 samples as the last step of the epoch.

For example: 15 × 66 + 10 = 1000, which means the epoch runs 66 batches of size 15, and the final step takes only the remaining 10 samples, for 67 steps in total.
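The arithmetic above can be sketched in plain Python. Note that `batch_sizes` is an illustrative helper written for this answer, not a Keras API; it just reproduces how `model.fit` slices an epoch into batches:

```python
import math

def batch_sizes(n_samples, batch_size):
    """Yield the size of each batch drawn in one epoch."""
    for start in range(0, n_samples, batch_size):
        # The last batch is truncated to whatever samples remain.
        yield min(batch_size, n_samples - start)

sizes = list(batch_sizes(1000, 15))
print(len(sizes))                 # 67 steps per epoch = ceil(1000 / 15)
print(sizes[-1])                  # 10, the partial final batch
print(math.ceil(1000 / 15))       # 67, same count via ceiling division
```

Every sample is still seen exactly once per epoch: 66 × 15 + 10 = 1000.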

However, this only works with input_shape. If we use batch_input_shape instead, it raises an error, because then the batch shape is fixed at the graph level and the trailing partial batch no longer matches it.
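To see why a fixed batch shape fails on the leftover samples, here is a minimal pure-Python sketch. `consume_fixed_batches` is a hypothetical stand-in, not Keras code; it mimics a graph whose batch dimension is baked in, so any batch of a different size is rejected:

```python
def consume_fixed_batches(n_samples, batch_size):
    """Mimic a graph with a fixed batch dimension: every batch
    must match batch_size exactly, so a trailing partial batch
    raises an error instead of being accepted."""
    for start in range(0, n_samples, batch_size):
        size = min(batch_size, n_samples - start)
        if size != batch_size:
            raise ValueError(
                f"expected batch of {batch_size}, got {size}"
            )

consume_fixed_batches(990, 15)    # 66 full batches: accepted
try:
    consume_fixed_batches(1000, 15)
except ValueError as err:
    print(err)                    # the trailing batch of 10 is rejected
```

With input_shape the batch dimension is left as None, so both the 15-sample and the 10-sample batches pass; fixing it removes that flexibility.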

