How is the hidden layer size determined for MLPRegressor in SciKitLearn?
Question
Lets say I'm creating a neural net using the following code:
from sklearn.neural_network import MLPRegressor

model = MLPRegressor(
    hidden_layer_sizes=(100,),
    activation='identity'
)
model.fit(X_train, y_train)
For the hidden_layer_sizes, I simply set it to the default. However, I don't really understand how it works. What is the number of hidden layers in my definition? Is it 100?
Answer
From the docs:
hidden_layer_sizes : tuple, length = n_layers - 2, default (100,)
The ith element represents the number of neurons in the ith hidden layer.
It is length = n_layers - 2, because the number of your hidden layers is the total number of layers n_layers minus 1 for your input layer, minus 1 for your output layer.
In your (default) case of (100,), it means one hidden layer of 100 units (neurons).
For 3 hidden layers of, say, 100, 50, and 25 units respectively, it would be
hidden_layer_sizes = (100, 50, 25)
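The relationship above can be checked on a fitted model: scikit-learn exposes n_layers_ (total layers, including input and output) and coefs_ (one weight matrix per layer-to-layer connection). The sketch below uses random data purely for illustration; the feature count (4) and max_iter value are arbitrary choices, and training is not expected to converge.

```python
# Sketch: verify that hidden_layer_sizes=(100, 50, 25) yields 3 hidden layers,
# i.e. n_layers_ = 5 (input + 3 hidden + output) and len(sizes) = n_layers_ - 2.
# Random data for illustration only; expect a ConvergenceWarning with max_iter=10.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))   # 4 input features
y = rng.normal(size=200)

model = MLPRegressor(hidden_layer_sizes=(100, 50, 25), max_iter=10)
model.fit(X, y)

print(model.n_layers_)                  # 5
print([w.shape for w in model.coefs_])  # [(4, 100), (100, 50), (50, 25), (25, 1)]
```

The coefs_ shapes make the layer structure explicit: each matrix connects one layer to the next, so three hidden layers produce four weight matrices.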
See the example in the docs (it is for MLPClassifier, but the logic is identical).