How to make Keras Neural Net outperform Logistic Regression on Iris data


Problem description

I am comparing a Keras neural net with a simple Logistic Regression from scikit-learn on the Iris data. I expect that the Keras NN will perform better, as suggested by this post.

But why, when mimicking the code there, is the Keras NN result lower than Logistic Regression's?

import seaborn as sns
import numpy as np
from sklearn.cross_validation import train_test_split  # moved to sklearn.model_selection in newer scikit-learn
from sklearn.linear_model import LogisticRegressionCV
from keras.models import Sequential
from keras.layers.core import Dense, Activation
from keras.utils import np_utils

# Prepare data
iris = sns.load_dataset("iris")
X = iris.values[:, 0:4]
y = iris.values[:, 4]

# Make test and train set
train_X, test_X, train_y, test_y = train_test_split(X, y, train_size=0.5, random_state=0)

################################
# Evaluate Logistic Regression
################################
lr = LogisticRegressionCV()
lr.fit(train_X, train_y)
pred_y = lr.predict(test_X)
print("Test fraction correct (LR-Accuracy) = {:.2f}".format(lr.score(test_X, test_y)))



################################
# Evaluate Keras Neural Network
################################

# Make ONE-HOT
def one_hot_encode_object_array(arr):
    '''One hot encode a numpy array of objects (e.g. strings)'''
    uniques, ids = np.unique(arr, return_inverse=True)
    return np_utils.to_categorical(ids, len(uniques))


train_y_ohe = one_hot_encode_object_array(train_y)
test_y_ohe = one_hot_encode_object_array(test_y)

model = Sequential()
model.add(Dense(16, input_shape=(4,)))
model.add(Activation('sigmoid'))
model.add(Dense(3))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer='adam')

# Actual modelling
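# nb_epoch is left at its default (10 in Keras 1.x), which may be too few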
model.fit(train_X, train_y_ohe, verbose=0, batch_size=1)
score, accuracy = model.evaluate(test_X, test_y_ohe, batch_size=16, verbose=0)  # "score" is the loss value, not an accuracy
print("Test fraction correct (NN-Score) = {:.2f}".format(score))
print("Test fraction correct (NN-Accuracy) = {:.2f}".format(accuracy))

I'm using this version of Keras:

In [2]: keras.__version__
Out[2]: '1.0.1'

The result shows:

Test fraction correct (LR-Accuracy) = 0.83
Test fraction correct (NN-Score) = 0.75
Test fraction correct (NN-Accuracy) = 0.60

According to that post, the accuracy of Keras should be 0.99. What went wrong?

Recommended answer

Your neural network is quite simple. Try creating a deeper network by adding more neurons and layers. It is also important to scale your features, as sketched below. Try the glorot_uniform initializer. Last but not least, increase the number of epochs and check that the loss decreases with each epoch.
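For the scaling step, a minimal sketch using scikit-learn's StandardScaler (this helper is my addition, not part of the original answer) could be:

from sklearn.preprocessing import StandardScaler

# iris.values yields an object array, so cast to float before scaling.
# Fit the scaler on the training split only, then reuse it on the test
# split, so no test-set statistics leak into training.
scaler = StandardScaler()
train_X = scaler.fit_transform(train_X.astype(float))
test_X = scaler.transform(test_X.astype(float))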

So here you go:

from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.layers.advanced_activations import PReLU
from keras.layers.normalization import BatchNormalization

model = Sequential()

# First hidden block: 4 input features -> 512 units
model.add(Dense(input_dim=4, output_dim=512, init='glorot_uniform'))
model.add(PReLU())
model.add(BatchNormalization())
model.add(Dropout(0.5))

# Four more identical hidden blocks of 512 units each
for _ in range(4):
    model.add(Dense(input_dim=512, output_dim=512, init='glorot_uniform'))
    model.add(PReLU())
    model.add(BatchNormalization())
    model.add(Dropout(0.5))

# Output layer: 3 classes with softmax
model.add(Dense(input_dim=512, output_dim=3, init='glorot_uniform'))
model.add(Activation('softmax'))
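Compiling and training this model, with the epoch count raised so you can watch the per-epoch loss, might look like the following sketch (nb_epoch=120 is my choice to match the result quoted below; Keras 2.x renames nb_epoch to epochs):

model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer='adam')

# verbose=1 prints the loss after every epoch, so you can confirm it keeps decreasing
model.fit(train_X, train_y_ohe, nb_epoch=120, batch_size=16, verbose=1)

score, accuracy = model.evaluate(test_X, test_y_ohe, verbose=0)
print("Test fraction correct (NN-Accuracy) = {:.2f}".format(accuracy))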

This reaches around 0.97 accuracy at about the 120th epoch.
