Using neuralnet with caret train and adjusting the parameters
Question
So I've read a paper that used neural networks to model a dataset similar to one I'm currently using. I have 160 descriptor variables that I want to model for 160 cases (regression modelling). The paper used the following parameters:
'For each split, a model was developed for each of the 10 individual train-test folds. A three layer back-propagation net with 33 input neurons and 16 hidden neurons was used with online weight updates, 0.25 learning rate, and 0.9 momentum. For each fold, learning was conducted from a total of 50 different random initial weight starting points and the network was allowed to iterate through learning epochs until the mean absolute error (MAE) for the validation set reached a minimum.'
They used specialist software called Emergent to do this, which is a very specialised neural network modelling package. However, as I've built my previous models in R, I need to stick with it. So I'm using the caret train function to do 10-fold cross-validation, repeated 10 times, with the neuralnet package. I did the following:
cadets.nn <- train(RT..seconds.~., data = cadet, method = "neuralnet", algorithm = 'backprop', learningrate = 0.25, hidden = 3, trControl = ctrl, linout = TRUE)
I did this to try to tune the parameters as closely as possible to the ones used in the paper, but I get the following error message:
layer1 layer2 layer3 RMSE Rsquared RMSESD RsquaredSD
1 1 0 0 NaN NaN NA NA
2 3 0 0 NaN NaN NA NA
3 5 0 0 NaN NaN NA NA
Error in train.default(x, y, weights = w, ...) :
final tuning parameters could not be determined
In addition: There were 50 or more warnings (use warnings() to see the first 50)
Do you know what I'm doing wrong? It works when I use nnet, but I can't tune the parameters for that to make it similar to the ones used in the paper I'm trying to mimic.
This is what I get from warnings(), fifty times over:
1: In eval(expr, envir, enclos) :
model fit failed for Fold01.Rep01: layer1=1, layer2=0, layer3=0 Error in neuralnet(form, data = data, hidden = nodes, ...) :
formal argument "hidden" matched by multiple actual arguments
2: In data.frame(..., check.names = FALSE) :
row names were found from a short variable and have been discarded
3: In eval(expr, envir, enclos) :
model fit failed for Fold01.Rep01: layer1=3, layer2=0, layer3=0 Error in neuralnet(form, data = data, hidden = nodes, ...) :
formal argument "hidden" matched by multiple actual arguments
4: In data.frame(..., check.names = FALSE) :
row names were found from a short variable and have been discarded
5: In eval(expr, envir, enclos) :
model fit failed for Fold01.Rep01: layer1=5, layer2=0, layer3=0 Error in neuralnet(form, data = data, hidden = nodes, ...) :
formal argument "hidden" matched by multiple actual arguments
Thanks!
Answer
train sets hidden for you (based on the values given by layer1-layer3). You are trying to specify that argument twice, hence:
formal argument "hidden" matched by multiple actual arguments
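In other words, a minimal sketch of the fix (assuming the cadet data frame, RT..seconds. outcome, and ctrl object from the question) is to pass the hidden-layer sizes through tuneGrid and drop the explicit hidden argument. Note that linout is an nnet argument; neuralnet's equivalent is linear.output.

```r
library(caret)

# layer1 = 16 mirrors the paper's 16 hidden neurons;
# layer2 = layer3 = 0 keep a single hidden layer.
# caret turns these tuning values into neuralnet's `hidden`,
# so `hidden` must not be supplied directly.
nn.grid <- expand.grid(layer1 = 16, layer2 = 0, layer3 = 0)

cadets.nn <- train(RT..seconds. ~ ., data = cadet,
                   method = "neuralnet",
                   tuneGrid = nn.grid,
                   trControl = ctrl,
                   # passed through to neuralnet():
                   algorithm = "backprop",
                   learningrate = 0.25,       # required for backprop
                   linear.output = TRUE)      # neuralnet's name for linout
```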
HTH,
Max