Monitoring weight sparsity during training


Question

I wonder whether it is possible to monitor the percentage of nonzero weights across the full network (not just a single layer) during training?

For example, I use

global_step = tf.train.get_or_create_global_step()
optim = tf.train.AdagradDAOptimizer(learning_rate=0.01, global_step=global_step).minimize(my_loss)

# Create the session once, outside the training loop, and initialize the variables.
sess = tf.Session()
sess.run(tf.global_variables_initializer())
for i in range(10):
  loss, _ = sess.run([my_loss, optim])

and I would like to print the ratio of the number of nonzero weights to the total number of weights after every iteration. Is that possible?

Answer

The following code calculates the number of nonzero weights.

import tensorflow as tf
import numpy as np

# Fetch the current values of all trainable variables from the training session
# and count the nonzero entries across all of them.
tvars = sess.run(tf.trainable_variables())
nonzero_parameters = np.sum([np.count_nonzero(var) for var in tvars])

This answer shows how to calculate the total number of weights: How to count total number of trainable parameters in a tensorflow model?
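Putting the two pieces together, a minimal sketch of the monitoring loop might look like the following. It assumes a TF 1.x graph in which my_loss is already defined, as in the question, and uses tf.train.get_or_create_global_step only because AdagradDAOptimizer requires a global_step argument; the total parameter count is computed once from the static variable shapes, while the nonzero count is re-evaluated after every step.

import tensorflow as tf
import numpy as np

# Assumes my_loss is a scalar loss tensor defined elsewhere in the graph.
global_step = tf.train.get_or_create_global_step()
optim = tf.train.AdagradDAOptimizer(learning_rate=0.01, global_step=global_step).minimize(my_loss)

# Total number of trainable parameters, computed once from the static shapes.
total_parameters = int(np.sum([np.prod(v.get_shape().as_list())
                               for v in tf.trainable_variables()]))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(10):
        loss, _ = sess.run([my_loss, optim])
        # Fetch the current variable values and count the nonzero entries.
        tvars = sess.run(tf.trainable_variables())
        nonzero_parameters = np.sum([np.count_nonzero(var) for var in tvars])
        print('step %d: loss %.4f, nonzero ratio %.4f'
              % (i, loss, nonzero_parameters / total_parameters))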
