Usage of tf.GraphKeys


Question

In TensorFlow there is a class GraphKeys. I came across a lot of code where it is used, but its purpose is not explained very well, either in the TensorFlow documentation or in the code that uses it.

Can someone explain the usage of tf.GraphKeys?

Thanks!

Answer

As far as I know, tf.GraphKeys is a collection of collections of keys for variables and ops in the graph. The usage (much like an ordinary Python dictionary) is to retrieve variables and ops.
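
For concreteness, here is a minimal sketch (assuming the TF 1.x graph-mode API; the variable names and shapes are made up) showing that the members of tf.GraphKeys are just string keys into the graph's collections, and that tf.get_collection retrieves whatever was registered under them:

import tensorflow as tf

w = tf.get_variable("w", shape=[2, 2])                  # registered in GLOBAL_VARIABLES and TRAINABLE_VARIABLES
b = tf.get_variable("b", shape=[2], trainable=False)    # registered in GLOBAL_VARIABLES only

print(tf.GraphKeys.GLOBAL_VARIABLES)                        # 'variables' - the key is just a string
print(tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES))     # [w, b]
print(tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES))  # [w]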

That said, here are some of the subsets of tf.GraphKeys I came across; each is illustrated with a short sketch after the list:

  • GLOBAL_VARIABLES and LOCAL_VARIABLES contain all variables of the graph, which need to be initialized before training. tf.global_variables() returns the global variables as a list and can be used with tf.variables_initializer for initialization.
  • Variables created with the option trainable=True are added to TRAINABLE_VARIABLES and are the ones fetched and updated by any optimizer under tf.train during training.
  • SUMMARIES contains the keys for all summaries added by tf.summary (scalar, image, histogram, text, etc.). tf.summary.merge_all gathers all such keys and returns an op that can be run and written to file so you can visualize them in TensorBoard.
  • Custom ops that update some variables can be added to UPDATE_OPS and run separately at each iteration using sess.run(tf.get_collection(tf.GraphKeys.UPDATE_OPS)). In that case, those variables are created with trainable=False so they are not also updated by gradient descent.
  • You may create your own collections using tf.add_to_collection(some_name, var_or_op) and retrieve the variables or ops later with tf.get_collection(), optionally filtering by scope.
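
To illustrate the first two points, a small sketch (TF 1.x assumed; the toy regression setup is made up for the example). The optimizer only touches what is in TRAINABLE_VARIABLES, and tf.global_variables() plus tf.variables_initializer initializes everything in GLOBAL_VARIABLES:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 3])
y = tf.placeholder(tf.float32, shape=[None, 1])

w = tf.get_variable("w", shape=[3, 1])                     # trainable=True by default
bias = tf.get_variable("bias", shape=[1], trainable=False) # kept out of TRAINABLE_VARIABLES

loss = tf.reduce_mean(tf.square(tf.matmul(x, w) + bias - y))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)  # updates only TRAINABLE_VARIABLES, i.e. w

init_op = tf.variables_initializer(tf.global_variables())  # same set as GLOBAL_VARIABLES
with tf.Session() as sess:
    sess.run(init_op)
    print(tf.trainable_variables())                         # [w] - bias is excluded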
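For the SUMMARIES collection, a sketch along these lines (the log directory and summary names are arbitrary):

import tensorflow as tf

loss = tf.get_variable("loss", shape=[], initializer=tf.zeros_initializer())
tf.summary.scalar("loss", loss)        # registers a summary op under tf.GraphKeys.SUMMARIES

merged = tf.summary.merge_all()        # gathers everything in the SUMMARIES collection into one op

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter("/tmp/tf_logs", sess.graph)
    writer.add_summary(sess.run(merged), global_step=0)
    writer.close()                     # point tensorboard --logdir /tmp/tf_logs at this directory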
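For UPDATE_OPS, here is a sketch of registering a custom update op for a non-trainable statistic (the running-mean decay of 0.9 is an arbitrary choice for the example):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None])

running_mean = tf.get_variable("running_mean", shape=[], trainable=False,
                               initializer=tf.zeros_initializer())
update_mean = tf.assign(running_mean, 0.9 * running_mean + 0.1 * tf.reduce_mean(x))
tf.add_to_collection(tf.GraphKeys.UPDATE_OPS, update_mean)

update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        sess.run(update_ops, feed_dict={x: [1.0, 2.0, 3.0]})  # run alongside the train op each iteration
    print(sess.run(running_mean))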
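And for custom collections, a minimal sketch (the collection name "my_vars" and the scope "layer1" are made up):

import tensorflow as tf

w1 = tf.get_variable("w1", shape=[2])
with tf.variable_scope("layer1"):
    w2 = tf.get_variable("w2", shape=[2])

tf.add_to_collection("my_vars", w1)
tf.add_to_collection("my_vars", w2)

print(tf.get_collection("my_vars"))                  # [w1, w2]
print(tf.get_collection("my_vars", scope="layer1"))  # [w2] - filtered by name prefix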

