What is the role of "Flatten" in Keras?


Problem description

I am trying to understand the role of the Flatten function in Keras. Below is my code, which is a simple two-layer network. It takes in 2-dimensional data of shape (3, 2), and outputs 1-dimensional data of shape (1, 4):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation, Flatten

model = Sequential()
model.add(Dense(16, input_shape=(3, 2)))
model.add(Activation('relu'))
model.add(Flatten())
model.add(Dense(4))
model.compile(loss='mean_squared_error', optimizer='SGD')

x = np.array([[[1, 2], [3, 4], [5, 6]]])  # batch of one sample with shape (3, 2)

y = model.predict(x)

print(y.shape)

This prints out that y has shape (1, 4). However, if I remove the Flatten line, then it prints out that y has shape (1, 3, 4).
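
(Added for illustration, not part of the original question.) A quick way to see where the extra axis survives is to print each layer's output shape; a minimal sketch, assuming Keras 2.x with its default layer names:

for layer in model.layers:
    print(layer.name, layer.output_shape)
# dense_1       (None, 3, 16)   <- Dense acts on the last axis only
# activation_1  (None, 3, 16)
# flatten_1     (None, 48)      <- 3 * 16 values flattened
# dense_2       (None, 4)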

I don't understand this. From my understanding of neural networks, the model.add(Dense(16, input_shape=(3, 2))) function is creating a hidden fully-connected layer, with 16 nodes. Each of these nodes is connected to each of the 3x2 input elements. Therefore, the 16 nodes at the output of this first layer are already "flat". So, the output shape of the first layer should be (1, 16). Then, the second layer takes this as an input, and outputs data of shape (1, 4).

So if the output of the first layer is already "flat" and of shape (1, 16), why do I need to further flatten it?

Recommended answer

If you read the Keras documentation entry for Dense, you will see that this call:

Dense(16, input_shape=(5,3))

would result in a Dense network with 3 inputs and 16 outputs, applied independently to each of the 5 steps. So, if D(x) transforms a 3-dimensional vector into a 16-dimensional vector, the output of the layer will be a sequence of vectors, [D(x[0,:]), D(x[1,:]), ..., D(x[4,:])], with shape (5, 16). To get the behavior you specify, you may first Flatten your input (to a 15-dimensional vector in this (5, 3) example) and then apply Dense:

from keras.models import Sequential
from keras.layers import Dense, Activation, Flatten

model = Sequential()
model.add(Flatten(input_shape=(3, 2)))  # (3, 2) -> (6,)
model.add(Dense(16))
model.add(Activation('relu'))
model.add(Dense(4))
model.compile(loss='mean_squared_error', optimizer='SGD')
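
To double-check the per-step behavior described above, here is a minimal sketch (an addition for illustration, assuming Keras 2.x and NumPy) that builds the (5, 3) example and inspects it:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(16, input_shape=(5, 3)))  # same 3 -> 16 map applied to each of the 5 steps

x = np.zeros((1, 5, 3))                   # batch of one (5, 3) sample
print(model.predict(x).shape)             # (1, 5, 16): a sequence of five 16-d vectors
print(model.layers[0].count_params())     # 64 = 3*16 + 16: weights are shared across steps

The parameter count also shows the layer is not fully connected to all 15 inputs (that would require 15*16 + 16 = 256 parameters), which is exactly why the Flatten-first version behaves differently.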

As some people struggled to understand this, here is an explanatory image:

[Image not reproduced in this copy: a diagram illustrating the flattening step.]
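
Since the image is not reproduced here, a tiny NumPy sketch of what Flatten does to a single sample (my own illustration):

import numpy as np

x = np.array([[1, 2], [3, 4], [5, 6]])  # one sample of shape (3, 2)
print(x.reshape(-1))                    # [1 2 3 4 5 6] -- the 6-d vector Dense then sees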
