TensorFlow not being deterministic where it should be


Problem description


I have a small network. I trained it [many hours] and saved it to a checkpoint. Now I want to restore from the checkpoint in a different script and use it. I recreate the session: I build the entire network, such that all ops are created again, using the exact same code as before training. This code sets the random seed for TF using time.time() [which is different every run].


I then restore from the checkpoint. I run the network and get different numbers [small but meaningful differences] every time I run the restored network. Crucially, the input is fixed. If I fix the random seed to some value, the non-deterministic behavior goes away.
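The effect of clock-based seeding can be illustrated outside TensorFlow. This is a minimal sketch, not the asker's original code: it uses numpy's `RandomState` to stand in for the graph-level seed, and the `time.time()`-based seed mirrors the pattern described above.

```python
import time
import numpy as np

# Seeding from the clock, as in the question: every run gets a new seed,
# so any op that consumes randomness gives different results per run.
rng_time = np.random.RandomState(int(time.time()) % 2**32)

# Seeding with a fixed value: identical draws on every run.
rng_fixed_1 = np.random.RandomState(1234)
rng_fixed_2 = np.random.RandomState(1234)

a = rng_fixed_1.uniform(size=4)
b = rng_fixed_2.uniform(size=4)
assert np.array_equal(a, b)  # fixed seed -> reproducible draws
```

Restoring a checkpoint restores variable *values*, but it does not restore the random seed; any op that draws randomness at run time will still differ between runs seeded from the clock.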


I am puzzled, because I thought that a restore [no variables were passed to the saver, so I presume the whole graph was checkpointed] eliminates all random behavior from this flow. Initializations etc. are overridden by the restored checkpoint, and this is only a forward run.


Is this possible? Does it make sense? Is there a way to find out which variables or factors in my graph are not set by the restored checkpoint?

Recommended answer


It seems this question was already answered in the comments, but no one has written down the answer explicitly yet, so here it is:


You were expecting the computation graph to always return the same values, even with different random seeds, because you thought there was no op in your graph that depends on the random seed.

You forgot about dropout.


In any case, I would always keep the random seed fixed. Then this and any other random op are deterministic, and your whole training can be as well. If at some point you wonder how much variance you get from different random seeds, you can explicitly try other seeds.
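Dropout is exactly such a seed-dependent op: at each forward pass it samples a fresh random mask. A minimal numpy sketch of inverted dropout (the function name and toy inputs are illustrative, not TensorFlow's implementation) shows both fixes: fixing the seed makes the sampled mask reproducible, and setting keep_prob to 1.0, the usual inference setting, removes the randomness entirely.

```python
import numpy as np

def dropout_forward(x, keep_prob, rng):
    # Inverted dropout: zero each unit with probability (1 - keep_prob),
    # scale survivors by 1/keep_prob so the expected activation is unchanged.
    mask = rng.uniform(size=x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

x = np.ones(8)

# Fixed seed: the mask, and hence the output, is identical across runs.
run1 = dropout_forward(x, 0.5, np.random.RandomState(42))
run2 = dropout_forward(x, 0.5, np.random.RandomState(42))
assert np.array_equal(run1, run2)

# Inference mode: keep_prob = 1.0 keeps every unit (uniform draws lie in
# [0, 1), so the mask is all True) and the output equals the input exactly.
eval_out = dropout_forward(x, 1.0, np.random.RandomState(7))
assert np.array_equal(eval_out, x)
```

For a restored network used only for forward passes, disabling dropout at inference (keep_prob = 1.0) is usually preferable to fixing the seed, since it also removes the mask's effect on the output values.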
