Change Hyperparameters on an Ongoing TensorFlow Simulation


Question

I wonder if it is possible to change some hyperparameters, say the learning rate or the regularization strength, in real time during a TensorFlow simulation.

Something like: you are monitoring the cost function of your neural network (NN) and decide that it could do better if you reduced the regularization term. But you would like to do this without interrupting everything: just type the new value in somewhere and have the regularization change at the next epoch, for example.

Answer

You can simply declare your hyperparameters as a placeholder or a non-trainable Variable, and change them as needed by passing a replacement value through feed_dict:

import tensorflow as tf  # TensorFlow 1.x graph-mode API

# A non-trainable variable holds the learning rate; feeding it through
# feed_dict overrides its value for that single sess.run call.
lr = tf.get_variable('learning_rate', initializer=tf.constant(1), trainable=False)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(lr))            # prints 1 (the initializer's value)
    print(sess.run(lr, {lr: 10}))  # prints 10 (fed value, this run only)
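Since a feed_dict override applies per sess.run call, you can re-read the hyperparameter before every training step. The pattern itself is framework-agnostic; here is a minimal plain-Python sketch (all names are illustrative, no TensorFlow required) where the learning rate is looked up fresh on each update, so editing it mid-run takes effect at the very next step:

```python
# Gradient descent on f(w) = (w - 3)^2, reading the learning rate
# from a mutable dict on every step -- the analogue of feeding a
# new value through feed_dict without restarting training.

hyperparams = {"lr": 0.4}  # mutable; can be edited at any time

def grad(w):
    # derivative of (w - 3)^2
    return 2.0 * (w - 3.0)

w = 0.0
for epoch in range(20):
    if epoch == 10:
        hyperparams["lr"] = 0.05  # "typed in" mid-run, no restart
    w -= hyperparams["lr"] * grad(w)

print(round(w, 4))  # → 3.0, close to the minimum at w = 3
```

Because the update rule consults `hyperparams["lr"]` every iteration rather than capturing it once, the change at epoch 10 simply takes effect from that epoch onward, which is exactly what the feed_dict trick achieves inside a TensorFlow session.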
