Number of parameters for Keras SimpleRNN

Question

I have a SimpleRNN like:

model.add(SimpleRNN(10, input_shape=(3, 1)))
model.add(Dense(1, activation="linear"))

The model summary shows:

simple_rnn_1 (SimpleRNN)   (None, 10)   120       

I am curious about the parameter count of 120 for simple_rnn_1. Could someone answer my question?

Thanks

Answer

When you look at the header of the table, you see the Param column:

Layer (type)              Output Shape   Param 
===============================================
simple_rnn_1 (SimpleRNN)   (None, 10)    120   

This number represents the number of trainable parameters (weights and biases) in the respective layer, in this case your SimpleRNN.

The formula for the number of weights is as follows:

recurrent_weights + input_weights + biases

or, equivalently: (num_features + num_units) * num_units + num_units

Explanation:

num_units = the number of units in the RNN

num_features = the number of features of your input
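
For the model in the question these map as follows (my reading of the layer call; in Keras the last dimension of input_shape is the feature dimension):

# SimpleRNN(10, input_shape=(3, 1))
num_units = 10     # first argument of SimpleRNN
num_features = 1   # last dimension of input_shape: 3 timesteps, 1 feature each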

Now there are two things happening in your RNN.

First, you have the recurrent loop, where the state is fed back into the model to generate the next step. The weights for the recurrent step are:

recurrent_weights = num_units * num_units

Second, you have the new input of your sequence at each step:

input_weights = num_features * num_units

(Usually the last RNN state and the new input are concatenated and then multiplied by one single weight matrix; nevertheless, the input and the last RNN state use different weights.)
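
To make the two weight matrices concrete, here is a minimal NumPy sketch of one SimpleRNN step (just the idea, not the actual Keras source; the names W_x, W_h and rnn_step are mine):

import numpy as np

num_features, num_units = 1, 10

W_x = np.zeros((num_features, num_units))  # input weights:      1 * 10 = 10
W_h = np.zeros((num_units, num_units))     # recurrent weights: 10 * 10 = 100
b   = np.zeros(num_units)                  # biases:                   10

def rnn_step(x_t, h_prev):
    # new state = tanh(input part + recurrent part + bias)
    return np.tanh(x_t @ W_x + h_prev @ W_h + b)

print(W_x.size + W_h.size + b.size)  # 120 trainable parameters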

So now we have the weights; what's missing are the biases - one bias for every unit:

biases = num_units * 1

So finally we have the formula:

recurrent_weights + input_weights + biases

= num_units * num_units + num_features * num_units + biases

= (num_features + num_units) * num_units + biases

In your case this means the trainable parameters are:

10 * 10 + 1 * 10 + 10 = 120
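
A quick way to check this is to evaluate the formula and compare it against what Keras itself reports (a minimal sketch, assuming the TensorFlow Keras API; count_params() is a standard layer method):

import tensorflow as tf

num_units, num_features = 10, 1

model = tf.keras.Sequential()
model.add(tf.keras.layers.SimpleRNN(num_units, input_shape=(3, num_features)))
model.add(tf.keras.layers.Dense(1, activation="linear"))

print((num_features + num_units) * num_units + num_units)  # 120
print(model.layers[0].count_params())                      # 120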

I hope this is understandable; if not, just tell me, so I can edit it to make it more clear.
