TensorFlow train batches for multiple epochs?


Problem description


I don't understand how to run the result of tf.train.batch for multiple epochs. It runs out once of course and I don't know how to restart it.

  • Maybe I can repeat it using tile, which is complicated but described in full here.
  • If I can redraw a batch each time that would be fine -- I would need batch_size random integers between 0 and num_examples. (My examples all sit in local RAM). I haven't found an easy way to get these random draws at once.
  • Ideally there would be a reshuffle when the batch is repeated, but it makes more sense to me to run an epoch and then reshuffle, etc., rather than joining the training set to itself num_epochs times and then shuffling.
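The random-draw idea in the second bullet is easy to do outside the graph when the examples sit in local RAM. A minimal NumPy sketch (variable names and the toy dataset are hypothetical, not from the question):

```python
import numpy as np

num_examples = 1000
batch_size = 32

# Toy in-memory dataset standing in for the examples held in local RAM.
data = np.arange(num_examples)

# Draw batch_size random indices in [0, num_examples) in one call,
# then gather the corresponding examples (sampling with replacement).
idx = np.random.randint(0, num_examples, size=batch_size)
batch = data[idx]

# For epoch-then-reshuffle semantics, permute once per epoch instead
# and walk the permutation in batch_size chunks.
perm = np.random.permutation(num_examples)
first_batch = data[perm[:batch_size]]
```

The permutation variant guarantees each example appears exactly once per epoch, while `randint` sampling does not.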


I think this is confusing because I'm not really building an input pipeline, since my input fits in memory, yet I still need to build out batching, shuffling, and multiple epochs, which possibly requires more knowledge of input pipelines.

Answer


tf.train.batch simply groups upstream samples into batches, and nothing more. It is meant to be used at the end of an input pipeline. Data and epochs are dealt with upstream.


For example, if your training data fits into a tensor, you could use tf.train.slice_input_producer to produce samples. This function has arguments for shuffling and epochs.
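Putting the answer together, a sketch of the full queue-based pipeline might look like the following. This assumes the TF 1.x API of the era (`tf.train.slice_input_producer` and the queue-runner machinery were removed in TF 2); the dataset shapes and hyperparameters are made up for illustration:

```python
import numpy as np
import tensorflow as tf  # assumes TF 1.x; these ops are gone in TF 2

# Hypothetical in-memory training data.
features = np.random.rand(100, 4).astype(np.float32)
labels = np.random.randint(0, 2, size=100).astype(np.int64)

num_epochs = 3
batch_size = 10

# slice_input_producer slices the tensors row-wise, reshuffles every
# epoch, and raises OutOfRangeError after num_epochs full passes.
x, y = tf.train.slice_input_producer(
    [features, labels], num_epochs=num_epochs, shuffle=True)

# tf.train.batch sits at the end of the pipeline and only groups the
# upstream samples into batches.
x_batch, y_batch = tf.train.batch([x, y], batch_size=batch_size)

with tf.Session() as sess:
    # num_epochs is tracked with a local variable, so this is required.
    sess.run(tf.local_variables_initializer())
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    try:
        while not coord.should_stop():
            xb, yb = sess.run([x_batch, y_batch])
            # ... run a training step on (xb, yb) ...
    except tf.errors.OutOfRangeError:
        pass  # all epochs have been consumed
    finally:
        coord.request_stop()
        coord.join(threads)
```

Note how epochs and shuffling are both handled upstream in `slice_input_producer`, exactly as the answer describes, while `tf.train.batch` does nothing but batch.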
