Meaning of an Epoch in Neural Networks Training


Question

While reading about how to build an ANN in pybrain, the documentation says:

Train the network for some epochs. Usually you would set something like 5 here,

trainer.trainEpochs( 1 )

I looked up what that means, and I concluded that we use one epoch of data to update the weights. If I choose to train the data with 5 epochs as pybrain advises, the dataset will be divided into 5 subsets, and the weights will be updated 5 times at most.

I'm familiar with online training, where the weights are updated after each data sample or feature vector. My question is: how can I be sure that 5 epochs will be enough to build the model and set the weights properly? What is the advantage of this approach over online training? The term "epoch" is also used for online training; does it mean one feature vector there?

Answer

One epoch consists of one full training cycle on the training set. Once every sample in the set has been seen, you start again, marking the beginning of the 2nd epoch.
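
To make this concrete, here is a minimal pybrain sketch showing where the questioner's trainEpochs call fits, assuming a toy XOR dataset (the network size and the number of epochs are arbitrary choices):

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

# Toy XOR dataset: 2 inputs, 1 target per sample (4 samples in total)
ds = SupervisedDataSet(2, 1)
ds.addSample((0, 0), (0,))
ds.addSample((0, 1), (1,))
ds.addSample((1, 0), (1,))
ds.addSample((1, 1), (0,))

net = buildNetwork(2, 3, 1)         # a small 2-3-1 feed-forward network
trainer = BackpropTrainer(net, ds)

# trainEpochs(5) runs 5 full passes over all 4 samples: the network sees
# the whole dataset 5 times; the dataset is NOT split into 5 subsets.
trainer.trainEpochs(5)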

This has nothing to do with batch or online training per se. Batch means that you update once at the end of each epoch (after every sample has been seen), i.e. #epochs updates in total, while online means that you update after each sample, i.e. #samples * #epochs updates in total.
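
As a rough illustration of the difference in update counts, here is a plain-Python sketch of gradient descent on a tiny made-up regression problem (not pybrain code):

import numpy as np

# Tiny 1-D regression problem: fit y = w * x by gradient descent.
xs = np.array([1.0, 2.0, 3.0, 4.0])
ys = np.array([2.0, 4.0, 6.0, 8.0])   # the true w is 2
lr, n_epochs = 0.01, 5

def grad(w, x, y):
    # derivative of 0.5 * (w*x - y)**2 with respect to w
    return (w * x - y) * x

# Batch: one update at the end of each epoch -> n_epochs updates in total.
w = 0.0
for epoch in range(n_epochs):
    g = sum(grad(w, x, y) for x, y in zip(xs, ys))
    w -= lr * g
print("batch w:", w)

# Online: one update per sample -> len(xs) * n_epochs updates in total.
w = 0.0
for epoch in range(n_epochs):
    for x, y in zip(xs, ys):
        w -= lr * grad(w, x, y)
print("online w:", w)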

You can't be sure whether 5 epochs or 500 will be enough for convergence, since it varies from dataset to dataset. You can stop training when the error converges or drops below a certain threshold. This also touches on preventing overfitting; you can read up on early stopping and cross-validation for that.
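
One simple way to act on the "stop when the error is low enough" idea in pybrain is to call trainer.train() (which runs a single epoch and returns its average error) in a loop; the threshold and epoch cap below are arbitrary, and the trainer from the sketch above is reused. pybrain also provides trainUntilConvergence, which holds out part of the data as a validation set and stops when the validation error stops improving:

# Reuses `trainer` from the sketch above; threshold and max_epochs are arbitrary.
max_epochs, threshold = 1000, 0.001
for epoch in range(max_epochs):
    err = trainer.train()       # one epoch over the full dataset, returns its average error
    if err < threshold:         # stop once the error is small enough
        break

# Built-in alternative with validation-based early stopping:
train_errors, val_errors = trainer.trainUntilConvergence(maxEpochs=1000)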
