Linking TensorBoard Embedding Metadata to checkpoint


Problem Description

I'm using the tflearn wrapper over tensorflow to build a model, and would like to add metadata (labels) to the resultant embedding visualization. Is there a way to link a metadata.tsv file to a saved checkpoint after the fact of running it?

I've created a projector_config.pbtxt file in the logdir of the checkpoint summaries, with the metadata.tsv being in the same folder. The config looks like this:

embeddings {
  tensor_name: "Embedding/W"
  metadata_path: "C:/tmp/tflearn_logs/shallow_lstm/"
}

It was created using the code from the docs - https://www.tensorflow.org/how_tos/embedding_viz/

I've commented out the tf.Session part in the hopes of creating the metadata link without the need of doing so directly within a Session object, but I'm not sure if that's possible.

import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

#with tf.Session() as sess:
config = projector.ProjectorConfig()
# One can add multiple embeddings.
embedding = config.embeddings.add()
embedding.tensor_name = 'Embedding/W'
# Link this tensor to its metadata file (e.g. labels).
embedding.metadata_path = 'C:/tmp/tflearn_logs/shallow_lstm/'
# Saves a config file that TensorBoard will read during startup.
projector.visualize_embeddings(tf.summary.FileWriter('/tmp/tflearn_logs/shallow_lstm/'), config)

Below is a snap of the current embedding visualization. Note the empty metadata. Is there a way to directly attach the desired metafile to this embedding?

Answer

I had the same problem and it is solved now :)

Essentially, all you need to do is the following 3 steps:

  1. save a model checkpoint, supposing the checkpoint's directory is ckp_dir;
  2. place projector_config.pbtxt and metadata.tsv in ckp_dir (a sketch of steps 1 and 2 follows this list);
  3. run tensorboard --logdir=ckp_dir and click the Embedding tab
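
For concreteness, here is a minimal sketch of steps 1 and 2 in TF 1.x Python. The checkpoint directory, the two-row embedding variable and the labels are placeholders for illustration, not the asker's actual tflearn model:

import os
import tensorflow as tf

ckp_dir = 'C:/tmp/tflearn_logs/shallow_lstm/'   # hypothetical checkpoint directory
labels = ['negative', 'positive']               # hypothetical labels, one per embedding row

# A stand-in for the embedding variable that tflearn would normally create as 'Embedding/W'.
with tf.variable_scope('Embedding'):
    embedding_var = tf.get_variable('W', shape=[len(labels), 8])

# Step 1: save a checkpoint into ckp_dir.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    tf.train.Saver().save(sess, os.path.join(ckp_dir, 'model.ckpt'))

# Step 2: write metadata.tsv (one label per line) and projector_config.pbtxt next to the checkpoint.
with open(os.path.join(ckp_dir, 'metadata.tsv'), 'w') as f:
    f.write('\n'.join(labels) + '\n')

with open(os.path.join(ckp_dir, 'projector_config.pbtxt'), 'w') as f:
    f.write('embeddings {\n  tensor_name: "Embedding/W"\n  metadata_path: "metadata.tsv"\n}\n')

Step 3 is then just tensorboard --logdir=C:/tmp/tflearn_logs/shallow_lstm/ (the same ckp_dir).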

The content of projector_config.pbtxt is:

    embeddings {
      tensor_name: "embedding_name"
      metadata_path: "metadata.tsv"
    }

This is the key to linking the embedding to metadata.tsv. In a tf.Session() we often fetch the embedding's value with something like sess.run('embedding_name:0'), but in projector_config.pbtxt we just write tensor_name: "embedding_name", without the ':0' output suffix.
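
If you are not sure of the exact tensor name, a quick sketch (assuming the graph is already built in the current default graph) is to list the variables and strip the ':0' suffix:

import tensorflow as tf

# Print each trainable variable's full name and the name the projector expects
# (the same string without the ':0' output suffix), e.g. 'Embedding/W:0' -> 'Embedding/W'.
for var in tf.trainable_variables():
    print(var.name, '->', var.name.split(':')[0])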

Generally, we can also specify the checkpoint path and the metadata_path in projector_config.pbtxt, so that the checkpoint, projector_config.pbtxt and metadata.tsv can live in different directories. But I think that is more complicated than necessary, and I just solved it as described above.
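
For reference, such a more explicit config might look roughly like the sketch below; the model_checkpoint_path field and the absolute paths are assumptions about the ProjectorConfig proto and your directory layout, not something taken from the steps above:

    model_checkpoint_path: "C:/tmp/checkpoints/model.ckpt"
    embeddings {
      tensor_name: "embedding_name"
      metadata_path: "C:/tmp/metadata_dir/metadata.tsv"
    }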

The result is shown here.

