Classification and regression using the same Neural Network using Keras


Question


I would like to build a Neural Network that outputs, at the same time, a label for classification and a value for regression. I would like to do that using Keras. Right now my code is only for classification:

 from keras.models import Sequential
 from keras.layers import Dense
 from keras.callbacks import ModelCheckpoint, EarlyStopping

 mdl = Sequential()
 mdl.add(Dense(100, activation='relu', input_dim=X_train.shape[1]))
 mdl.add(Dense(200, activation='relu'))
 mdl.add(Dense(100, activation='relu'))
 mdl.add(Dense(6, activation='softmax'))

 mdl.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])

 # early stopping implementation
 filepath = "weights.best.hdf5"
 checkpoint = ModelCheckpoint(filepath, monitor='val_acc', verbose=1,
                              save_best_only=True, mode='max')
 early_stop = EarlyStopping(monitor='val_acc', patience=100, mode='max')
 callbacks_list = [checkpoint, early_stop]

 # fit network
 history = mdl.fit(X_train, y_train, epochs=2000, batch_size=32,
                   validation_split=0.2, verbose=2, shuffle=True,
                   callbacks=callbacks_list)


So right now I have a softmax activation function on the output layer that corresponds to the probabilities I use for classification. How can I modify this code to also output a continuous value that will represent my regression problem? I know that the Keras Functional API allows specifying multi-input and multi-output networks. Does anyone have an idea of how I can do that?

Answer


The same code in a slightly different pattern

There's a straightforward transformation of your code to the Keras Functional API, as illustrated in its documentation. You'd need to change your Sequential declaration

mdl = Sequential()
mdl.add(Dense(100, activation='relu', input_dim=X_train.shape[1]))
mdl.add(Dense(200, activation='relu'))
mdl.add(Dense(100, activation='relu'))
mdl.add(Dense(6, activation='softmax'))

to its functional equivalent:

inputs = Input(shape=(X_train.shape[1],))
layer1 = Dense(100, activation='relu')(inputs)
layer2 = Dense(200, activation='relu')(layer1)
layer3 = Dense(100, activation='relu')(layer2)
classifier = Dense(6, activation='softmax')(layer3)
mdl = Model(inputs=inputs, outputs=classifier)


(Often people just re-use the same variable for all the intermediate layers; it's even done in the documentation samples, but this IMHO is a bit clearer.)


Once you've done that, you can add another output layer that "branches" from the last Dense layer, layer3, and declare that your model has two outputs, for example:

regression = Dense(1, activation='linear')(layer3)
mdl = Model(inputs=inputs, outputs=[classifier, regression])
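With two outputs you also need one loss per output when compiling, and two target arrays when fitting. Here is a minimal end-to-end sketch of the idea (assumes TensorFlow 2.x Keras; the random data and the names `y_train_cls` / `y_train_reg` are hypothetical placeholders, and the `loss_weights` values are just illustrative, not tuned):

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# Dummy data standing in for the question's X_train / y_train:
# 64 samples, 10 features, 6 one-hot classes, 1 continuous target.
X_train = np.random.rand(64, 10).astype("float32")
y_train_cls = np.eye(6, dtype="float32")[np.random.randint(0, 6, 64)]
y_train_reg = np.random.rand(64, 1).astype("float32")

inputs = Input(shape=(X_train.shape[1],))
layer1 = Dense(100, activation='relu')(inputs)
layer2 = Dense(200, activation='relu')(layer1)
layer3 = Dense(100, activation='relu')(layer2)

# Two heads branching from the same shared trunk; naming them lets
# us refer to each output by name in compile() and fit().
classifier = Dense(6, activation='softmax', name='cls')(layer3)
regression = Dense(1, activation='linear', name='reg')(layer3)

mdl = Model(inputs=inputs, outputs=[classifier, regression])

# One loss per output; loss_weights balances their contributions
# to the total training loss.
mdl.compile(optimizer='sgd',
            loss={'cls': 'categorical_crossentropy', 'reg': 'mse'},
            loss_weights={'cls': 1.0, 'reg': 0.5})

# Targets are passed as a dict (or list) matching the output names.
mdl.fit(X_train, {'cls': y_train_cls, 'reg': y_train_reg},
        epochs=2, batch_size=32, verbose=0)
```

At predict time the model returns two arrays, one per head, in the order given to `outputs`.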
