Number of parameters for Keras SimpleRNN

Problem description

I have a SimpleRNN like:

model.add(SimpleRNN(10, input_shape=(3, 1)))
model.add(Dense(1, activation="linear"))

The model summary says:

simple_rnn_1 (SimpleRNN)   (None, 10)   120       

I am curious about the number of parameters for simple_rnn_1, which is 120.

Could someone answer my question?

Answer

When you look at the header of the table, you see the column Param:

Layer (type)              Output Shape   Param 
===============================================
simple_rnn_1 (SimpleRNN)   (None, 10)    120   

This number represents the number of trainable parameters (weights and biases) in the respective layer, in this case your SimpleRNN.

The weights are computed as follows:

recurrent_weights + input_weights + biases

that is: (num_features + num_units) * num_units + num_units

Explanation:

num_units = the number of units in the RNN

num_features = the number of features of your input

Now you have two things happening in your RNN.

First, you have the recurrent loop, where the state is fed recurrently into the model to generate the next step. The weights for the recurrent step are:

recurrent_weights = num_units * num_units

Second, you have new input from your sequence at each step.

input_weights = num_features * num_units

(Usually both the last RNN state and the new input are concatenated and then multiplied with a single weight matrix; nevertheless, the inputs and the last RNN state use different weights.)
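In Keras itself, the SimpleRNN layer stores the input weights, the recurrent weights, and the bias as three separate arrays. A minimal sketch (using the model from the question; the exact import paths may differ between standalone Keras and tf.keras) that inspects their shapes:

from keras.models import Sequential
from keras.layers import SimpleRNN, Dense

model = Sequential()
model.add(SimpleRNN(10, input_shape=(3, 1)))
model.add(Dense(1, activation="linear"))

# get_weights() returns [kernel, recurrent_kernel, bias] for a SimpleRNN layer
kernel, recurrent_kernel, bias = model.layers[0].get_weights()
print(kernel.shape)            # (1, 10)  -> input_weights:     num_features * num_units
print(recurrent_kernel.shape)  # (10, 10) -> recurrent_weights: num_units * num_units
print(bias.shape)              # (10,)    -> biases:            num_units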

So now we have the weights; what's missing are the biases - one bias for every unit:

biases = num_units * 1

So finally we have the formula:

recurrent_weights + input_weights + biases

= num_units * num_units + num_features * num_units + biases

= (num_features + num_units) * num_units + biases
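Written as a small helper function (a minimal sketch; simple_rnn_params is just a hypothetical name used here for illustration), the same count is:

def simple_rnn_params(num_features, num_units):
    # (num_features + num_units) * num_units weights, plus one bias per unit
    return (num_features + num_units) * num_units + num_units

print(simple_rnn_params(num_features=1, num_units=10))  # 120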

In your case this means the trainable parameters are:

10*10 + 1*10 + 10 = 120
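You can confirm this number directly in Keras by rebuilding the two-layer model from the question and asking the layer itself; a minimal sketch (exact import paths may differ between standalone Keras and tf.keras):

from keras.models import Sequential
from keras.layers import SimpleRNN, Dense

model = Sequential()
model.add(SimpleRNN(10, input_shape=(3, 1)))
model.add(Dense(1, activation="linear"))

print(model.layers[0].count_params())  # 120
model.summary()  # the Param column shows 120 for the SimpleRNN layer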

I hope this is understandable; if not, just tell me, so I can edit it to make it more clear.
