Log to tensorboard an internal loss function in Tensorflow 2.0


Problem description

I have been implementing my first Variational Autoencoder for a custom application I am working on. Everything runs smoothly, but I am not able to log the KL term to Tensorboard at each batch forward pass. I want this in order to gain insight into the model and to study the effect of KL annealing techniques.

I've defined my model from the examples on the TF documentation webpage, meaning that the KL term is added to the current batch loss by the Model.add_loss() function:

import tensorflow as tf

class VariationalAutoEncoder(tf.keras.Model):
    def __init__(self,
                 original_dim=18,
                 intermediate_dim=64,
                 latent_dim=2,
                 name='Autoencoder',
                 **kwargs):
        super(VariationalAutoEncoder, self).__init__(name=name, **kwargs)
        self.original_dim = original_dim
        self.encoder = Encoder(latent_dim=latent_dim,
                               intermediate_dim=intermediate_dim)
        self.decoder = Decoder(original_dim=original_dim,
                               intermediate_dim=intermediate_dim)
        self.kl_loss_weight = 0

    def call(self, inputs):
        # Forward pass of the Encoder
        z_mean, z_log_var, z = self.encoder(inputs)
        # Forward pass of the Decoder taking the re-parameterized z latent variable
        reconstructed = self.decoder(z)

        # Add KL divergence regularization loss for this forward pass
        # THIS IS THE VALUE I WANT TO LOG TO TENSORBOARD
        kl_loss = -0.5 * tf.reduce_mean(z_log_var - tf.square(z_mean) - tf.exp(z_log_var) + 1)

        # HERE !!!! I ADD THE KL TERM weighted by an internal variable.
        self.add_loss(self.kl_loss_weight * kl_loss)
        return reconstructed
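The Encoder and Decoder referenced above are not shown in the question. For context, a minimal sketch of what they might look like, following the TF documentation's VAE example (layer sizes and activations here are assumptions, not taken from the question):

```python
import tensorflow as tf

class Encoder(tf.keras.layers.Layer):
    """Maps inputs to (z_mean, z_log_var, z) using the re-parameterization trick."""

    def __init__(self, latent_dim=2, intermediate_dim=64, name='Encoder', **kwargs):
        super(Encoder, self).__init__(name=name, **kwargs)
        self.dense_proj = tf.keras.layers.Dense(intermediate_dim, activation='relu')
        self.dense_mean = tf.keras.layers.Dense(latent_dim)
        self.dense_log_var = tf.keras.layers.Dense(latent_dim)

    def call(self, inputs):
        x = self.dense_proj(inputs)
        z_mean = self.dense_mean(x)
        z_log_var = self.dense_log_var(x)
        # Sample z = mean + sigma * epsilon so gradients flow through mean/log_var
        epsilon = tf.random.normal(shape=tf.shape(z_mean))
        z = z_mean + tf.exp(0.5 * z_log_var) * epsilon
        return z_mean, z_log_var, z


class Decoder(tf.keras.layers.Layer):
    """Maps a latent sample back to the original input dimension."""

    def __init__(self, original_dim=18, intermediate_dim=64, name='Decoder', **kwargs):
        super(Decoder, self).__init__(name=name, **kwargs)
        self.dense_proj = tf.keras.layers.Dense(intermediate_dim, activation='relu')
        self.dense_output = tf.keras.layers.Dense(original_dim)

    def call(self, inputs):
        return self.dense_output(self.dense_proj(inputs))
```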

I define, compile and train my model with the built-in functions (I am not defining a custom training loop):

vae = VariationalAutoEncoder(original_dim=18, intermediate_dim=intermediate_dim, latent_dim=latent_dim)

optimizer = tf.keras.optimizers.Adam(learning_rate=lr)

vae.compile(optimizer,
            loss=tf.keras.losses.MeanSquaredError(),
            metrics=[tf.keras.metrics.MeanAbsoluteError(name="reconstruction_MAE")])

With this setup, I am able to get Tensorboard logs of the combined loss function (reconstruction + KL) and of the reconstruction term alone (via the MAE metric). Nevertheless, I cannot find a way to also log the KL term, since its value is computed inside the graph.

I have tried to use a multi-output configuration but it requires assigning constant loss_weights, which I need to avoid since I want to use KL annealing.
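As an aside, KL annealing itself does not require constant loss weights if the weight lives on the model and is updated between epochs. A minimal sketch of such a schedule as a Keras callback, assuming the model stores the weight as a non-trainable `tf.Variable` (e.g. `self.kl_loss_weight = tf.Variable(0.0, trainable=False)`) so that updates are visible inside the compiled `call()` graph; the linear ramp is an illustrative choice:

```python
import tensorflow as tf

class KLAnnealingCallback(tf.keras.callbacks.Callback):
    """Linearly ramps the model's KL weight from 0 to 1 over `anneal_epochs` epochs."""

    def __init__(self, anneal_epochs=10):
        super().__init__()
        self.anneal_epochs = anneal_epochs

    def on_epoch_begin(self, epoch, logs=None):
        # Clamp at 1.0 once the annealing window has passed
        weight = min(1.0, epoch / self.anneal_epochs)
        self.model.kl_loss_weight.assign(weight)
```

A plain Python float would be baked into the traced graph on the first call, which is why the `tf.Variable` is needed for the schedule to take effect.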

I want to avoid writing a custom training loop; with one I would be able to do this, but I have tried several ideas so far and I now believe a custom loop may be the only way. Am I wrong? Is there a way to log this internal loss value to Tensorboard?

Recommended answer

Consider using tf.summary.scalar:

class VariationalAutoEncoder(tf.keras.Model):
    <...>
    def call(self, inputs):
        <...>
        kl_loss = - 0.5 * tf.reduce_mean(z_log_var - tf.square(z_mean) - tf.exp(z_log_var) + 1)
        tf.summary.scalar('KL loss', kl_loss)
        self.add_loss(self.kl_loss_weight * kl_loss)
        return reconstructed

from datetime import datetime
import tensorflow as tf

logdir = "logs/scalars/" + datetime.now().strftime("%Y%m%d-%H%M%S")
file_writer = tf.summary.create_file_writer(logdir + "/metrics")
file_writer.set_as_default()
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=logdir)
vae = VariationalAutoEncoder(original_dim=18, intermediate_dim=intermediate_dim, latent_dim=latent_dim)

optimizer = tf.keras.optimizers.Adam(learning_rate=lr)

vae.compile(optimizer,
            loss=tf.keras.losses.MeanSquaredError(),
            metrics=[tf.keras.metrics.MeanAbsoluteError(name="reconstruction_MAE")])

# Note: callbacks are passed to fit(), not compile():
# vae.fit(..., callbacks=[tensorboard_callback])
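One caveat worth noting: tf.summary.scalar called inside call() needs both a default writer and a current step to record anything. An alternative is Keras's add_metric (part of the TF 2.x built-in Keras API; removed in newer Keras 3 releases), which surfaces the raw KL value as a regular named metric that the TensorBoard callback then logs automatically. A minimal self-contained sketch under that assumption (layer sizes are placeholders, not from the question):

```python
import tensorflow as tf

class VAEWithKLMetric(tf.keras.Model):
    """Minimal VAE sketch that exposes the raw KL term as a Keras metric."""

    def __init__(self, original_dim=18, intermediate_dim=64, latent_dim=2, **kwargs):
        super(VAEWithKLMetric, self).__init__(**kwargs)
        self.hidden = tf.keras.layers.Dense(intermediate_dim, activation='relu')
        self.dense_mean = tf.keras.layers.Dense(latent_dim)
        self.dense_log_var = tf.keras.layers.Dense(latent_dim)
        self.out = tf.keras.layers.Dense(original_dim)
        # tf.Variable so the weight can be annealed between epochs
        self.kl_loss_weight = tf.Variable(0.0, trainable=False)

    def call(self, inputs):
        h = self.hidden(inputs)
        z_mean = self.dense_mean(h)
        z_log_var = self.dense_log_var(h)
        z = z_mean + tf.exp(0.5 * z_log_var) * tf.random.normal(tf.shape(z_mean))
        kl_loss = -0.5 * tf.reduce_mean(z_log_var - tf.square(z_mean) - tf.exp(z_log_var) + 1)
        self.add_loss(self.kl_loss_weight * kl_loss)
        # Surfaces the unweighted KL term in fit() logs and TensorBoard as 'kl_loss'
        self.add_metric(kl_loss, name='kl_loss', aggregation='mean')
        return self.out(z)
```

This sidesteps the writer/step bookkeeping entirely, since Keras handles metric logging through its normal callback machinery.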
