Keras model to fit polynomial

Problem description

I generated some data from a 4th degree polynomial and wanted to create a regression model in Keras to fit this polynomial. The problem is that the predictions after fitting seem to be basically linear. Since this is my first time working with neural nets, I assume I made a very trivial and stupid mistake.

Here is my code:

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(units=200, input_dim=1))
model.add(Activation('relu'))
model.add(Dense(units=45))
model.add(Activation('relu'))
model.add(Dense(units=1))

model.compile(loss='mean_squared_error',
              optimizer='sgd')

model.fit(x_train, y_train, epochs=20, batch_size=50)

loss_and_metrics = model.evaluate(x_test, y_test, batch_size=100)

classes = model.predict(x_test, batch_size=1)

x_train and y_train are numpy arrays containing the first 9900 entries from this file.
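
For reference, a minimal sketch of how the training and test arrays might be prepared from such a file, assuming it holds two whitespace-separated columns of (x, y) values; the filename data.txt is only a placeholder for the actual download:

import numpy as np

# Hypothetical loading step: adjust the path and delimiter to match the actual file.
data = np.loadtxt('data.txt')              # placeholder filename, two columns: x, y
x_train = data[:9900, 0].reshape(-1, 1)    # first 9900 entries, shaped (N, 1) for input_dim=1
y_train = data[:9900, 1]
x_test = data[9900:, 0].reshape(-1, 1)     # remaining entries kept for evaluation
y_test = data[9900:, 1]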

I tried different batch_sizes, numbers of epochs, layer sizes, and amounts of training data. Nothing seems to help.

Please point out everything you see that does not make sense!

Answer

Neural networks generally won't do a good job of extrapolating polynomial functions. However, if your training and testing data come from the same range, you can achieve quite nice results. I generated some data and used your code:

import numpy as np
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense, Activation

# training and test inputs are drawn from the same range [0, 1]
x_train=np.random.rand(9000)
y_train=x_train**4+x_train**3-x_train
x_train=x_train.reshape(len(x_train),1)

x_test=np.linspace(0,1,100)
y_test=x_test**4+x_test**3-x_test
x_test=x_test.reshape(len(x_test),1)


model = Sequential()
model.add(Dense(units=200, input_dim=1))
model.add(Activation('relu'))
model.add(Dense(units=45))
model.add(Activation('relu'))
model.add(Dense(units=1))

model.compile(loss='mean_squared_error',
              optimizer='sgd')

model.fit(x_train, y_train, epochs=40, batch_size=50, verbose=1)

loss_and_metrics = model.evaluate(x_test, y_test, batch_size=100)

classes = model.predict(x_test, batch_size=1)

test=x_test.reshape(-1)
plt.plot(test,classes,c='r')
plt.plot(test,y_test,c='b')
plt.show()

Note that I increased epochs to 40 to get more iterations and more accurate results. I also set verbose=1 to be able to see how the loss behaves. The loss does indeed decrease, down to 7.4564e-04. In the resulting plot, the red line is the prediction of the network and the blue line is the correct values; you can see that they are quite close to each other.
