Keras: how to output learning rate onto TensorBoard
Question
I added a callback to decay the learning rate:
keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=100,
                                  verbose=0, mode='auto', epsilon=0.00002,
                                  cooldown=20, min_lr=0)
# note: the `epsilon` argument was renamed `min_delta` in later Keras releases
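For reference, here is a minimal, self-contained sketch of wiring such a ReduceLROnPlateau callback into model.fit. The tiny model and synthetic data are illustrative assumptions, and it uses tf.keras with the newer min_delta parameter name rather than the original epsilon:

```python
import numpy as np
import tensorflow as tf

# Tiny synthetic regression task, purely to exercise the callback.
x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss="mse")

# `epsilon` from the question was renamed `min_delta` in later Keras releases.
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="loss", factor=0.5, patience=1, min_delta=0.00002, min_lr=0)

history = model.fit(x, y, epochs=3, verbose=0, callbacks=[reduce_lr])
print(history.history["loss"])  # one loss value per epoch
```

With `verbose=1` on the callback, a console message is printed whenever the learning rate is actually reduced, which is another quick way to confirm it kicks in.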
And here is my TensorBoard callback:
keras.callbacks.TensorBoard(log_dir='./graph/rank{}'.format(hvd.rank()),
                            histogram_freq=10, batch_size=FLAGS.batch_size,
                            write_graph=True, write_grads=True, write_images=False)
I want to make sure the decay has kicked in during training, so I want to output the learning rate to TensorBoard, but I cannot find where to set this up.
I also checked the optimizer API, but no luck:
keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)
So how can I output the learning rate to TensorBoard?
Answer
According to the author of Keras, the proper way is to subclass the TensorBoard callback:
from keras import backend as K
from keras.callbacks import TensorBoard

class LRTensorBoard(TensorBoard):
    # add other arguments to __init__ if you need
    def __init__(self, log_dir, **kwargs):
        super().__init__(log_dir=log_dir, **kwargs)

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        logs.update({'lr': K.eval(self.model.optimizer.lr)})
        super().on_epoch_end(epoch, logs)
Then pass it as part of the callbacks argument to model.fit (credit Finncent Price):
model.fit(x=..., y=..., callbacks=[LRTensorBoard(log_dir="/tmp/tb_log")])
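To sanity-check that the subclass actually injects the learning rate, you can run a throwaway fit and inspect the returned history, since any key added to logs is also recorded there. The sketch below is an illustrative assumption, not part of the original answer: it uses tf.keras, synthetic data, and reads the optimizer's learning_rate attribute directly:

```python
import numpy as np
import tensorflow as tf

class LRTensorBoard(tf.keras.callbacks.TensorBoard):
    """TensorBoard callback that also reports the current learning rate."""
    def __init__(self, log_dir, **kwargs):
        super().__init__(log_dir=log_dir, **kwargs)

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # Read the optimizer's learning-rate variable as a plain float.
        logs["lr"] = float(self.model.optimizer.learning_rate.numpy())
        super().on_epoch_end(epoch, logs)

# Throwaway model and data, purely to exercise the callback.
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss="mse")

history = model.fit(x, y, epochs=2, verbose=0,
                    callbacks=[LRTensorBoard(log_dir="/tmp/tb_log")])
print(history.history["lr"])  # the logged learning rate, one value per epoch
```

In TensorBoard itself, the value then shows up as a scalar named `lr` alongside the loss curves.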