How does tf.sign work with back propagation?
Problem description
I'm starting out with TensorFlow, and I am trying to create a binary network. More specifically, I'm looking for a hashing network that would encode an image as a binary vector of size n. So I have a classic network, and I binarize the last layer before the softmax.
TensorFlow provides a tf.sign function that can do just that, but I can't figure out how back-propagation can work through it in practice, since the function is not continuous.
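The setup described above can be sketched as follows (the layer sizes, variable names, and toy input are illustrative assumptions, not taken from the question):

```python
import tensorflow as tf

# Hypothetical sketch: an encoder producing n real-valued features,
# binarized with tf.sign. Sizes and names are assumptions for illustration.
n = 16
encoder = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(n),          # real-valued codes before binarization
])

images = tf.random.normal([4, 32])     # dummy batch standing in for image features
codes = tf.sign(encoder(images))       # each entry is -1.0, 0.0, or +1.0
```

Note that tf.sign maps negative inputs to -1, zero to 0, and positive inputs to +1; the question is what gradient, if any, flows back through this step during training.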
Any ideas?
Recommended answer
Unfortunately, the gradient of tf.sign() is always defined as 0 (probably because there is no good way to define it). This is how it is registered in the TensorFlow source:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/math_grad.py#L542
@ops.RegisterGradient("Sign")
def _SignGrad(op, _):
  """Returns 0."""
  x = op.inputs[0]
  return array_ops.zeros(array_ops.shape(x), dtype=x.dtype)
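You can observe this registered gradient directly. The short check below (my own illustration, not part of the original answer) differentiates tf.sign with a GradientTape and shows that the result is zero everywhere, so no learning signal reaches layers behind the binarization:

```python
import tensorflow as tf

# Differentiate y = sign(x) and inspect the gradient.
x = tf.Variable([-2.0, 0.5, 3.0])
with tf.GradientTape() as tape:
    y = tf.sign(x)

grad = tape.gradient(y, x)
print(grad.numpy())  # [0. 0. 0.] -- the registered gradient is all zeros
```

In practice this means that if you place tf.sign before the loss, the weights feeding into it receive zero gradient and never update; binarized networks typically work around this with a surrogate gradient of some kind rather than the true (zero) one.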