Printing out the random seed automatically chosen by tensorflow
Question
I was working on hyperparameter optimization for a neural network. I ran the model for 20 epochs. After figuring out the best hyperparameters, I ran the same model again on its own (now with no hyperparameter optimization) but got different results. Not only that, I found that the accuracy reached during hyperparameter optimization occurred at the last epoch (the 20th). On the other hand, when I ran the same model again, the same accuracy was not reached until around 200 epochs, and even then the values were slightly lower. Below is the figure:
Therefore, I would like to know what random seed was chosen by tensorflow at that moment. To be clear, I am not interested in setting the random seed to some constant; I want to see what tensorflow chose.
Any help is greatly appreciated!
Answer
This question is very similar, but it has no answer; see the comment thread. In general, you cannot "extract the seed" at any given time, because once the RNG has started working there is no seed anymore.
If you just want to see the initial seed, you need to understand that there are graph-level and op-level seeds (see tf.set_random_seed, and the implementation in random_seed.py):
- If both are set, then both are combined to produce the actual seed.
- If the graph seed is set but the op seed is not, the seed is determined deterministically from the graph seed and the "op id".
- If the op seed is set but the graph seed is not, then a default graph seed is used.
- If neither is set, a random seed is produced. To see where this comes from, look at GuardedPhiloxRandom, which provides the two numbers that are finally used by PhiloxRandom. When no seed at all is provided, it picks two random values generated from /dev/urandom, as seen in random.cc.
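As a rough sketch, the four rules above can be mirrored in plain Python. This is only an illustration of the logic described for random_seed.py, not TensorFlow's actual code; the constant 87654321 is TensorFlow's default graph seed, and the tuples returned below match the seed pairs printed in the example further down:

```python
# Sketch of TensorFlow's seed-combination rules (illustration only,
# not the real implementation in random_seed.py).
DEFAULT_GRAPH_SEED = 87654321  # TensorFlow's default graph-level seed

def combine_seeds(graph_seed, op_seed, op_id):
    if graph_seed is not None and op_seed is not None:
        # Both set: both are combined to produce the actual seed.
        return graph_seed, op_seed
    if graph_seed is not None:
        # Graph seed only: op seed derived deterministically from the op id.
        return graph_seed, op_id
    if op_seed is not None:
        # Op seed only: fall back to the default graph seed.
        return DEFAULT_GRAPH_SEED, op_seed
    # Neither set: TensorFlow would draw two values from /dev/urandom.
    return None, None

print(combine_seeds(None, 100, op_id=1))   # (87654321, 100)
print(combine_seeds(200, None, op_id=15))  # (200, 15)
print(combine_seeds(200, 300, op_id=15))   # (200, 300)
```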
You can actually see these, by the way, when they are set. You just need to access the specific random operation that you are interested in and read its seed and seed2 attributes. Note that the TensorFlow public functions return the result of a few extra operations (scaling, displacing), so you have to "climb up" the graph a bit to get to the interesting op:
import tensorflow as tf

def print_seeds(random_normal):
    # Climb from the public op's output up to the underlying random
    # TensorFlow op (RandomStandardNormal) and print its seed attributes.
    random_op = random_normal.op.inputs[0].op.inputs[0].op
    print(random_op.get_attr('seed'), random_op.get_attr('seed2'))

print_seeds(tf.random_normal(()))
# 0 0
print_seeds(tf.random_normal((), seed=100))
# 87654321 100
tf.set_random_seed(200)
print_seeds(tf.random_normal(()))
# 200 15
print_seeds(tf.random_normal((), seed=300))
# 200 300
Unfortunately, when the seed is unspecified, there is no way to retrieve the random values generated by TensorFlow. The two random numbers are passed to PhiloxRandom, which uses them to initialize its internal key_ and counter_ variables, and those cannot be read back in any way.
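To see why the seed cannot be recovered, a toy counter-based generator helps: after construction, the generator only keeps a key (folded from the two seeds) and a counter, and each draw hashes the pair (key, counter). This is a deliberately simplified sketch of the counter-based idea, not the real Philox-4x32 algorithm:

```python
import hashlib

class CounterRNG:
    """Toy counter-based RNG illustrating the key/counter scheme
    (a simplification, not the actual PhiloxRandom implementation)."""

    def __init__(self, seed, seed2):
        # The two seeds are folded into the key; from here on, the
        # original seed values are no longer stored anywhere.
        self.key = (seed << 32) | seed2
        self.counter = 0

    def next_uint32(self):
        # Each draw hashes (key, counter) and advances the counter.
        digest = hashlib.sha256(f"{self.key}:{self.counter}".encode()).digest()
        self.counter += 1
        return int.from_bytes(digest[:4], "little")

rng = CounterRNG(200, 15)
print(rng.next_uint32(), rng.next_uint32())  # two deterministic pseudo-random draws
```

The same seed pair always reproduces the same stream, but from key_ and counter_ alone you cannot invert the fold back to the original seeds, which is exactly the situation in TensorFlow once the RNG has started.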