What is the meaning of the "None" in model.summary of Keras?

Question

What is the meaning of the (None, 100) in Output Shape? Is this ("None") the sample number or the hidden dimension?
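
A minimal sketch of the kind of summary being asked about (the layer sizes here are made up for illustration, assuming tensorflow.keras is available):

```python
from tensorflow import keras

# Hypothetical sizes: 20 input features, a Dense layer with 100 units.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(100),
])
model.summary()  # the Dense layer's Output Shape is reported as (None, 100)
```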

Answer

None means this dimension is variable.

The first dimension in a Keras model is always the batch size. You don't need a fixed batch size, except in very specific cases (for instance, when working with stateful=True LSTM layers).

That's why this dimension is often ignored when you define your model. For instance, when you define input_shape=(100,200), you're actually leaving out the batch size and defining the shape of each sample. Internally the shape will be (None, 100, 200), allowing a variable batch size, with each sample in the batch having the shape (100, 200).
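
A quick way to see this (a sketch, assuming tensorflow.keras) is to inspect the symbolic input tensor created from that per-sample shape:

```python
from tensorflow import keras

inputs = keras.Input(shape=(100, 200))  # per-sample shape only
print(inputs.shape)  # (None, 100, 200): the batch axis is left open as None
```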

The batch size will then be automatically defined in the fit or predict methods.
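
As a rough sketch of that point (the array sizes and layer choice here are arbitrary), the batch size only shows up when you call fit or predict, and it can even differ between the two calls:

```python
import numpy as np
from tensorflow import keras

# Hypothetical data: 320 samples with 100 features each.
x = np.random.rand(320, 100)
y = np.random.rand(320, 1)

model = keras.Sequential([
    keras.Input(shape=(100,)),  # no batch size fixed here
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

model.fit(x, y, batch_size=32, epochs=1)  # batch size chosen at training time
model.predict(x, batch_size=64)           # a different batch size at prediction time
```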

Other None dimensions:

Not only the batch dimension can be None; many other dimensions can be variable as well.

For instance, in a 2D convolutional network, where the expected input is (batchSize, height, width, channels), you can have shapes like (None, None, None, 3), allowing variable image sizes.
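
A sketch of such a network (the filter count and the pooling choice are assumptions for illustration; 3 channels as in the example above):

```python
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(None, None, 3)),  # variable height and width, 3 channels
    keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    keras.layers.GlobalAveragePooling2D(),  # collapses the variable spatial axes
    keras.layers.Dense(10),
])
model.summary()  # the Conv2D output shape is reported as (None, None, None, 16)
```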

In recurrent networks and in 1D convolutions, you can also make the length/timesteps dimension variable, with shapes like (None, None, featuresOrChannels).
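
For example, a minimal sketch of a recurrent model over variable-length sequences (the feature count and LSTM size are made-up values):

```python
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(None, 8)),  # (timesteps, features) with timesteps left variable
    keras.layers.LSTM(32),
])
model.summary()  # the LSTM output shape is reported as (None, 32)
```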
