How is input dataset fed into neural network?


Question

If I have 1000 observations in my dataset, each with 15 features and 1 label, how is the data fed into the input neurons for the forward pass and back propagation? Is it fed row-wise, one of the 1000 observations at a time, with the weights updated after each observation, or is the full data given as an input matrix and the network then learns the corresponding weight values over a number of epochs? Also, if it is fed one at a time, what is an epoch in that case? Thanks

Answer

Assuming that the data is formatted into rows (1000 instances with 16 columns each, the last column being the label), you would feed in the first 15 features row by row and use the last column as the target. This is called online learning. Online learning requires you to feed in the data one example at a time and perform the back propagation and weight update for every example. As you can imagine, this can get quite expensive because of the backpropagation and update done for every single instance of your data.
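To make the per-example update concrete, here is a minimal sketch in Python/NumPy of online learning on a toy problem of the same shape (1000 rows, 15 features, 1 label). The random data, learning rate, and the logistic output unit are illustrative assumptions, not details from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 15))          # 1000 observations, 15 features
y = rng.integers(0, 2, size=(1000, 1))   # 1 binary label per observation

w = np.zeros((15, 1))                    # weights of a single logistic unit
b = 0.0
lr = 0.01                                # illustrative learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Online learning: one forward pass, one backward pass, and one weight
# update per row of the dataset.
for i in range(X.shape[0]):
    x_i = X[i:i + 1]                     # shape (1, 15): a single observation
    pred = sigmoid(x_i @ w + b)          # forward pass for this one example
    err = pred - y[i:i + 1]              # gradient of the loss w.r.t. the logit
    w -= lr * x_i.T @ err                # weight update for this one example
    b -= lr * err.item()
```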

The other option you mentioned is feeding the entire dataset into the network at once. This performs poorly in practice because convergence is extremely slow.
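For contrast, full-batch gradient descent under the same toy setup would look roughly like the sketch below (it reuses X, y, w, b, lr and sigmoid from the previous snippet): the whole 1000x15 matrix goes through the forward pass, and there is only a single averaged weight update per pass over the data.

```python
# Continues the toy setup above (X, y, w, b, lr, sigmoid already defined).
# Full-batch gradient descent: one forward/backward pass over all 1000 rows,
# then a single averaged weight update per epoch.
for epoch in range(100):
    pred = sigmoid(X @ w + b)            # forward pass on the entire input matrix
    err = pred - y
    w -= lr * X.T @ err / len(X)         # one averaged update per epoch
    b -= lr * err.mean()
```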

In practice, mini-batches are used. This involves sending a small subset of the dataset through the network and then doing the back propagation and weight update. This gives the benefit of relatively frequent weight updates to speed up learning, while being less expensive than online learning. For more information on mini-batches, see this University of Toronto lecture by Geoffrey Hinton.
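A self-contained sketch of mini-batch training on the same toy problem is below. The batch size, learning rate, and number of epochs are illustrative choices; the epoch loop also shows how one epoch remains a full pass over all 1000 rows regardless of the batch size.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 15))                   # 1000 rows, 15 features
y = rng.integers(0, 2, size=(1000, 1)).astype(float)

w = np.zeros((15, 1))
b = 0.0
lr, batch_size, n_epochs = 0.05, 32, 10           # illustrative hyperparameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(n_epochs):                     # one epoch = one full pass over the data
    order = rng.permutation(len(X))               # shuffle the rows each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = X[idx], y[idx]                   # one mini-batch of up to 32 rows
        pred = sigmoid(xb @ w + b)                # forward pass on the mini-batch
        err = pred - yb
        w -= lr * xb.T @ err / len(idx)           # one weight update per mini-batch
        b -= lr * err.mean()
```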

Finally, an epoch is always one run through all of your data. It doesn't matter whether you feed it in one example at a time or all at once.

I hope this clarifies your questions.

