Why does my neural network never overfit?
Problem description
I am training a deep residual network with 10 hidden layers on game data.
Does anyone have an idea why I don't see any overfitting here? Training and test loss are still decreasing after 100 epochs of training.
Recommended answer
Just a few suggestions:

- For deep learning, an even more aggressive split such as 90/10 or 95/5 is recommended (Andrew Ng).
- Such a small gap between the two curves suggests that your `learning_rate` is not tuned; try increasing it (and, probably, the number of `epochs` as well, if you implement some kind of 'smart' lr-reduce; see the first sketch after this list).
- It is also a reasonable sanity check for a DNN to try to overfit a small amount of data (10-100 rows) with an enormous number of iterations (see the second sketch after this list).
- Check for data leakage in the set; a weights analysis inside each layer may help you with this.
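As a concrete illustration of raising the base learning rate and letting a 'smart' lr-reduce schedule bring it back down, here is a minimal sketch using Keras's ReduceLROnPlateau callback. The framework, the random regression data, and the small dense model are assumptions for demonstration only; the original question does not show the actual network or the game data.

```python
import numpy as np
import tensorflow as tf

# Stand-in data: random regression inputs/targets, since the real game data is not shown.
x = np.random.rand(1000, 20).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

# Stand-in model: a small dense net in place of the original 10-hidden-layer residual network.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Start from a higher base learning rate than before ...
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss="mse")

# ... and let the schedule shrink it once the held-out loss stops improving.
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss",  # watch the test/validation loss
    factor=0.5,          # halve the learning rate at each plateau
    patience=5,          # tolerate 5 epochs without improvement first
    min_lr=1e-6,
)

model.fit(
    x, y,
    validation_split=0.1,   # 90/10 split, as suggested above
    epochs=300,             # allow more epochs, since the learning rate decays over time
    callbacks=[reduce_lr],
    verbose=0,
)
```

With a larger learning rate the two loss curves should separate sooner, which makes it easier to see whether the network is capable of overfitting at all.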
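The small-data overfitting test can be sketched the same way: train on 10-100 rows for many iterations and check that the training loss goes to roughly zero. Again, the data and model below are hypothetical stand-ins, not the setup from the question.

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in subset (10-100 rows). A network with enough capacity should
# be able to memorize it; if the training loss does not approach zero,
# something in the pipeline (loss, labels, preprocessing) is likely broken.
x_small = np.random.rand(50, 20).astype("float32")
y_small = np.random.rand(50, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss="mse")

# An enormous number of iterations relative to the data size.
history = model.fit(x_small, y_small, epochs=2000, batch_size=50, verbose=0)
print("final training loss:", history.history["loss"][-1])  # expect a value close to 0
```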