Keras: retrieve value of a node before the activation function


Question


Imagine a fully-connected neural network with its last two layers of the following structure:

[Dense]
    units = 612
    activation = softplus

[Dense]
    units = 1
    activation = sigmoid

The output value of the net is 1, but I'd like to know what the input x to the sigmoidal function was (it must be some high number, since sigm(x) is 1 here).

Following indraforyou's answer, I managed to retrieve the outputs and weights of the Keras layers:

import numpy as np
from keras import backend as K

outputs = [layer.output for layer in model.layers[-2:]]
functors = [K.function([model.input] + [K.learning_phase()], [out]) for out in outputs]

test_input = np.array(...)
layer_outs = [func([test_input, 0.]) for func in functors]

print(layer_outs[-1][0])  # -> array([[ 1.]])

dense_0_out = layer_outs[-2][0]                           # shape (612, 1)
dense_1_weights = model.layers[-1].weights[0].get_value() # shape (1, 612)
dense_1_bias = model.layers[-1].weights[1].get_value()

x = np.dot(dense_0_out, dense_1_weights) + dense_1_bias
print(x)  # -> -11.7

How can x be a negative number? In that case, the last layer's output should be a number closer to 0.0 than to 1.0. Are dense_0_out or dense_1_weights the wrong outputs or weights?

Answer

Since you're using get_value(), I'll assume that you're using the Theano backend. To get the value of the node before the sigmoid activation, you can traverse the computation graph.

The graph can be traversed starting from the outputs (the result of some computation) down to its inputs using the owner field.

In your case, what you want is the input x of the sigmoid activation op. The output of the sigmoid op is model.output. Putting these together, the variable x is model.output.owner.inputs[0].
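The owner-based traversal can be illustrated with a toy expression graph (a pure-Python sketch of the owner/inputs structure, not Theano's actual classes): each variable records the op that produced it, and each op records its input variables, so walking one step back from the output reaches the pre-activation variable.

```python
class Op:
    def __init__(self, name, inputs):
        self.name = name
        self.inputs = inputs        # input Variables consumed by this op

class Variable:
    def __init__(self, name, owner=None):
        self.name = name
        self.owner = owner          # the Op that produced this variable (None for graph inputs)

# build: output = sigmoid(add(dot(h, W), b))
h = Variable("h")
W = Variable("W")
b = Variable("b")
z_dot = Variable("z_dot", owner=Op("dot", [h, W]))
z = Variable("z", owner=Op("add", [z_dot, b]))
output = Variable("output", owner=Op("sigmoid", [z]))

# walking one step back from the output yields the pre-activation variable,
# analogous to model.output.owner.inputs[0] in Theano
pre_activation = output.owner.inputs[0]
```

Here pre_activation is the variable named "z", produced by the add op, mirroring the Elemwise addition the answer describes next.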

If you print out this value, you'll see Elemwise{add,no_inplace}.0, which is an element-wise addition op. This can be verified from the source code of Dense.call():

def call(self, inputs):
    output = K.dot(inputs, self.kernel)
    if self.use_bias:
        output = K.bias_add(output, self.bias)
    if self.activation is not None:
        output = self.activation(output)
    return output

The input to the activation function is the output of K.bias_add().

With a small modification of your code, you can get the value of the node before activation:

x = model.output.owner.inputs[0]
func = K.function([model.input] + [K.learning_phase()], [x])
print(func([test_input, 0.]))

For anyone using the TensorFlow backend: use x = model.output.op.inputs[0] instead.
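The recovered value can also be cross-checked by hand. In a minimal numpy sketch (hypothetical random values, independent of any backend), the pre-activation is np.dot(h, W) + b, where h has shape (1, 612) and the final layer's kernel W has shape (612, 1); note the operand order, which yields a single number whose sigmoid is the layer's output.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
h = rng.standard_normal((1, 612))  # hypothetical output of the 612-unit Dense layer
W = rng.standard_normal((612, 1))  # hypothetical kernel of the final Dense layer
b = rng.standard_normal((1,))      # hypothetical bias

x = np.dot(h, W) + b               # pre-activation value, shape (1, 1)
y = sigmoid(x)                     # what the final layer outputs

# a large positive pre-activation drives the sigmoid toward 1, so an
# observed output of 1 implies x is a high number, not a negative one
```

Reversing the operands, as in the question's np.dot(dense_0_out, dense_1_weights), would only be valid if the stated shapes (612, 1) and (1, 612) were accurate, and would produce a (612, 612) matrix rather than a scalar, which is one reason to double-check shapes when recomputing by hand.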

