Non-linear Regression: Why isn't the model learning?


Problem description


I just started learning Keras. I am trying to train a non-linear regression model in Keras, but the model doesn't seem to learn much.

# imports needed to run the snippet stand-alone
import numpy as np
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

#datapoints
X = np.arange(0.0, 5.0, 0.1, dtype='float32').reshape(-1,1)   # 50 points in [0, 5)
y = 5 * np.power(X,2) + np.power(np.random.randn(50).reshape(-1,1),3)   # y = 5x^2 + cubed Gaussian noise

#model
model = Sequential()
model.add(Dense(50, activation='relu', input_dim=1))
model.add(Dense(30, activation='relu', kernel_initializer='uniform'))   # Keras 2 name for the old 'init' argument
model.add(Dense(1, activation='linear'))   # Keras 2 style for the old 'output_dim=1'

#training
sgd = SGD(lr=0.1)
model.compile(loss='mse', optimizer=sgd, metrics=['accuracy'])
model.fit(X, y, epochs=1000)   # Keras 2 name for the old 'nb_epoch'

#predictions
predictions = model.predict(X)

#plot
plt.scatter(X, y,edgecolors='g')
plt.plot(X, predictions,'r')
plt.legend([ 'Predictated Y' ,'Actual Y'])
plt.show()

What am I doing wrong?

Answer

Your learning rate is too high. With targets as large as ~125 and an MSE loss, lr=0.1 produces very large weight updates, so SGD oscillates or diverges instead of converging.


Also, irrelevant to your issue, but you should not ask for metrics=['accuracy'], as this is a regression setting and accuracy is meaningless.
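(Not part of the original answer:) if you do want something to monitor during training, a regression-appropriate metric such as mean absolute error is meaningful, e.g.:

# optional: a regression metric instead of accuracy (reuses the same sgd optimizer)
model.compile(loss='mse', optimizer=sgd, metrics=['mae'])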


So, with these changes:

sgd = SGD(lr=0.001)   # 100x smaller learning rate
model.compile(loss='mse', optimizer=sgd)   # no 'accuracy' metric for regression

plt.legend(['Predicted Y', 'Actual Y'])  # typo in legend :)


here are some outputs (results will be different among runs, due to the random element of your y):
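As a side check (not part of the original answer), one quick way to confirm that the lower learning rate actually helps is to plot the per-epoch training loss that model.fit returns via its History object:

#loss check (assumes the corrected model and sgd from above)
history = model.fit(X, y, epochs=1000, verbose=0)
plt.plot(history.history['loss'])
plt.xlabel('epoch')
plt.ylabel('training MSE')
plt.show()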

