How to change the batch size during training?


Problem description


During training, at each epoch, I'd like to change the batch size (for experimental purposes). Creating a custom Callback seems appropriate, but batch_size isn't a member of the Model class.


The only way I see would be to override fit_loop and expose batch_size to the callback at each loop. Is there a cleaner or faster way to do it without using a callback?

Answer


I think it would be better to use a custom data generator, so you have control over the data you pass to the training loop: you can generate batches of different sizes, process data on the fly, and so on. Here is an outline:

def data_gen(data):
    while True:  # Keras expects the generator to yield forever
        # process data into a batch; it can be any size --
        # it's your responsibility to construct each batch
        yield x, y  # here x and y form a single batch


Now you can train with model.fit_generator(data_gen(data), steps_per_epoch=100), which will consume 100 batches per epoch. You can also use a Sequence if you want to encapsulate this inside a class.

