How to use model.reset_states() in Keras?
Question
I have sequential data and I declared an LSTM model which predicts y from x in Keras. So if I call model.predict(x1) and then model.predict(x2), is it correct to call model.reset_states() explicitly between the two predict() calls? Does model.reset_states() clear the history of inputs but not the weights?
# data1
x1 = [2,4,2,1,4]
y1 = [1,2,3,2,1]
# data2
x2 = [5,3,2,4,5]
y2 = [5,3,2,3,2]
In my actual code I use model.evaluate(). Inside evaluate(), is reset_states() called implicitly for each data sample?
model.evaluate(dataX, dataY)
Answer
reset_states() clears only the hidden states of your network, not its weights. It's worth mentioning that the behaviour of this function depends on whether stateful=True was set on your network. If it was not set, all states are automatically reset after every batch computation (so e.g. during fit, predict and evaluate as well). If stateful=True, you should call reset_states() yourself every time you want consecutive model calls to be independent.
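As a minimal sketch of the answer above (assuming TensorFlow 2.x's tf.keras, where Model.reset_states() is available; the layer sizes and data below are illustrative, not from the original post):

```python
import numpy as np
import tensorflow as tf

# A stateful LSTM needs a fixed batch size so Keras can keep one
# hidden/cell state per sample between calls.
inputs = tf.keras.Input(batch_shape=(1, 5, 1))
x = tf.keras.layers.LSTM(4, stateful=True)(inputs)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)

x1 = np.array([2, 4, 2, 1, 4], dtype="float32").reshape(1, 5, 1)
x2 = np.array([5, 3, 2, 4, 5], dtype="float32").reshape(1, 5, 1)

# With stateful=True, predict(x1) leaves its final hidden state behind,
# so we reset explicitly to make the two sequences independent.
p1 = model.predict(x1, verbose=0)
model.reset_states()
p2 = model.predict(x2, verbose=0)

# reset_states() zeroes the recurrent state but leaves the weights untouched.
w_before = model.get_weights()
model.reset_states()
w_after = model.get_weights()
assert all(np.array_equal(a, b) for a, b in zip(w_before, w_after))
```

Without stateful=True this bookkeeping is unnecessary: each predict(), fit() or evaluate() batch already starts from a zero state.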