TensorFlow: Restored variables seem random


Problem description


I'm having an issue restoring some variables. I've previously restored variables when saving the whole model at a higher level, but this time I've decided to restore only a few of them. Before the first session, I initialize the weights:

weights = {
    '1': tf.Variable(tf.random_normal([n_input, n_hidden_1], mean=0, stddev=tf.sqrt(2*1.67/(n_input+n_hidden_1))), name='w1')
}
weights_saver = tf.train.Saver(var_list=weights)


Then, in a session, while I train the NN:

with tf.Session(config=tf.ConfigProto(gpu_options=gpu_options)) as sess:
    [...]
    weights_saver.save(sess, './savedModels/Weights/weights')

Then:

with tf.Session() as sess:
    new_saver = tf.train.import_meta_graph(pathsToVariables + 'Weights/weights.meta')
    new_saver.restore(sess, pathsToVariables + 'Weights/weights')

    weights = {
        '1': tf.Variable(sess.graph.get_tensor_by_name("w1:0"), name='w1', trainable=False)
    }

    sess.run(tf.global_variables_initializer())
    print(sess.run(weights['1']))


But at this point, the restored weights seem to be random. Indeed, if I run sess.run(tf.global_variables_initializer()) again, the weights come out different, as if I had restored the weights' initialization op rather than the trained values.

What am I doing wrong?

Is my question clear?

Answer

weights = {
    '1': tf.Variable(sess.run(sess.graph.get_tensor_by_name("w1:0")), name='w1', trainable=False)
}


I found the answer: I needed to run the tensor to get its value. It seems obvious now.
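The difference can be shown in isolation: passing a tensor to tf.Variable wires the new variable's initializer to that tensor's op (here, a fresh random draw on every initialization run), while passing the evaluated NumPy value pins the initializer to a constant. A minimal sketch, assuming TF 1.x graph mode (via tf.compat.v1 on TF 2; the names are illustrative):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

rand = tf.random_normal([4])
# Passing the tensor: the variable's initializer is the random op itself,
# so every run of the initializer draws fresh values.
v_from_tensor = tf.Variable(rand, name='from_tensor')

with tf.Session() as sess:
    value = sess.run(rand)  # evaluate once -> a concrete NumPy array

# Passing the value: the initializer is a constant holding that array.
v_from_value = tf.Variable(value, name='from_value')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    a1, b1 = sess.run([v_from_tensor, v_from_value])
    sess.run(tf.global_variables_initializer())  # re-initialize
    a2, b2 = sess.run([v_from_tensor, v_from_value])

# a1 and a2 differ (new random draw each init); b1 and b2 are identical.
```

This is exactly why restoring and then wrapping the restored tensor in a new tf.Variable still yields "random" values after another global initialization.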


However, this is not a good way to initialize tensors from other values, because restoring and then creating the tensor produces two tensors with the same name. Or, if the names differ, the variable restored from the old model stays in the graph and may get optimized later on. It is better to restore the variable in a previous session, store its values, close that session, and open a new one to create everything else:

with tf.Session() as sess:
    weight1 = sess.run(sess.graph.get_tensor_by_name("w1:0"))

tf.reset_default_graph()  # this will eliminate the variables we restored

with tf.Session() as sess:
    weights = {
        '1': tf.Variable(weight1, name='w1-bis', trainable=False)
    }
    ...


We are now sure the restored variables are not part of the new graph.
