Keras with TF backend: get gradient of outputs with respect to inputs


Problem description


I have a very simple Keras MLP, and I'm trying to get the gradient of the outputs with respect to the inputs.

I'm using the following code:

import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

regressor = Sequential([
    Dense(32, input_shape=(n_features,), activation='relu'),
    Dense(1)
])
regressor.compile(optimizer=SGD(lr=0.1), loss='mse')

regressor.fit(x, y)

output_tens = regressor.layers[-1].output
input_tens = regressor.layers[0].input

grad = tf.gradients(output_tens, input_tens)
with tf.Session() as sess:
    sess.run(grad, feed_dict={input_tens: np.zeros((1, n_features))})


This fails with the following error:

FailedPreconditionError: Attempting to use uninitialized value dense_7/bias
     [[Node: dense_7/bias/read = Identity[T=DT_FLOAT, _class=["loc:@dense_7/bias"], _device="/job:localhost/replica:0/task:0/cpu:0"](dense_7/bias)]]


(The stack trace is long and, I assume, not very informative, so I'm not adding it here).


Is my approach basically correct? Is there anything special I have to do?

Thanks!

Recommended answer

You need to use the Keras session to make it work:

import keras.backend as K

# Reuse the session Keras already initialized; don't wrap it in a
# `with` block, which would close it and break later Keras calls.
sess = K.get_session()
sess.run(grad, feed_dict={input_tens: np.zeros((1, n_features))})

When you instantiate a new session, it does not contain the variables that Keras initialized during training, hence the `FailedPreconditionError`.

