Using Tensorboard to monitor training real time and visualize the model architecture


Problem description

I am learning to use TensorBoard with TensorFlow 2.0.

In particular, I would like to monitor the learning curves in real time and also to visually inspect and communicate the architecture of my model.

Below I will provide code for a reproducible example.

I have three questions:

  1. Although I get the learning curves once the training is over, I don't know what I should do to monitor them in real time.

  2. The learning curve I get from TensorBoard does not agree with the plot of history.history. In fact it is bizarre, and its reversals are difficult to interpret.

  3. I cannot make sense of the graph. I have trained a sequential model with 5 dense layers and dropout layers in between, but what TensorBoard shows me has many more elements in it.

My code is as follows:

from datetime import datetime

import matplotlib.pyplot as plt
from tensorflow import keras
from tensorflow.keras.datasets import boston_housing
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

inputs = Input(shape = (train_data.shape[1], ))
x1 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(inputs)
x1a = Dropout(0.5)(x1)
x2 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x1a)
x2a = Dropout(0.5)(x2)
x3 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x2a)
x3a = Dropout(0.5)(x3)
x4 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x3a)
x4a = Dropout(0.5)(x4)
x5 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x4a)
predictions = Dense(1)(x5)
model = Model(inputs = inputs, outputs = predictions)

model.compile(optimizer = 'Adam', loss = 'mse')

logdir = "logs/fit/" + datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)

history = model.fit(train_data, train_targets,
          batch_size= 32,
          epochs= 20,
          validation_data=(test_data, test_targets),
          shuffle=True,
          callbacks=[tensorboard_callback ])

plt.plot(history.history['loss'], label='loss')
plt.plot(history.history['val_loss'], label='val_loss')
plt.legend()
plt.show()

Answer

I think what you can do is to launch TensorBoard before calling .fit() on your model. If you are using IPython (Jupyter or Colab) and have already installed TensorBoard, here is how you can modify your code:

from datetime import datetime

import matplotlib.pyplot as plt
from tensorflow import keras
from tensorflow.keras.datasets import boston_housing
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

inputs = Input(shape = (train_data.shape[1], ))
x1 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(inputs)
x1a = Dropout(0.5)(x1)
x2 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x1a)
x2a = Dropout(0.5)(x2)
x3 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x2a)
x3a = Dropout(0.5)(x3)
x4 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x3a)
x4a = Dropout(0.5)(x4)
x5 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x4a)
predictions = Dense(1)(x5)
model = Model(inputs = inputs, outputs = predictions)

model.compile(optimizer = 'Adam', loss = 'mse')

logdir = "logs/fit/" + datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)
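As a side note on the second question: if several runs write events into the same log directory, TensorBoard overlays all of their curves on one chart, which can produce exactly the kind of bizarre reversals described above. The timestamped subdirectory in the snippet avoids this; the idea can be sketched with a small stdlib-only helper (the function name `make_run_logdir` is hypothetical, not part of any library):

```python
import os
from datetime import datetime

def make_run_logdir(root="logs/fit", now=None):
    """Return a per-run log directory such as logs/fit/20200102-030405.

    Giving every training run its own timestamped subdirectory keeps
    TensorBoard from overlaying the curves of unrelated runs.
    """
    now = now or datetime.now()
    return os.path.join(root, now.strftime("%Y%m%d-%H%M%S"))

# Deterministic example with a fixed timestamp:
print(make_run_logdir(now=datetime(2020, 1, 2, 3, 4, 5)))
```

Pointing TensorBoard at the parent `logs/fit` directory then shows each run as its own labeled curve.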

In another cell, you can run:

# Magic func to use TensorBoard directly in IPython
%load_ext tensorboard

Launch TensorBoard by running this in another cell:

# Launch TensorBoard with the objects in the log directory.
# This should open TensorBoard inline (or in your browser), though the
# dashboards stay empty until the first events are written.
# Note: --logdir expects a path, not a Python variable name, so point it
# at the parent directory that contains the timestamped run folders.
%tensorboard --logdir logs/fit

And you can finally call .fit() on your model in another cell:

history = model.fit(train_data, train_targets,
          batch_size= 32,
          epochs= 20,
          validation_data=(test_data, test_targets),
          shuffle=True,
          callbacks=[tensorboard_callback ])

plt.plot(history.history['loss'], label='loss')
plt.plot(history.history['val_loss'], label='val_loss')
plt.legend()
plt.show()

If you are not using IPython, you just have to launch TensorBoard during or before training your model to monitor it in real time.
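A minimal sketch of that, assuming the `tensorboard` command is on your PATH and the callback writes under the `logs/fit` directory used in the code:

```shell
# Start TensorBoard from a terminal, pointing --logdir at the parent
# directory that contains the timestamped run subdirectories.
tensorboard --logdir logs/fit --port 6006
# Then open http://localhost:6006 in a browser; the Scalars tab refreshes
# periodically during training, and the Graphs tab shows the architecture.
```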
