Keras: 2D input -> 2D output?

Problem description

I want to build a neural network to learn a set of standard feature vectors. The set is thus of shape (N, 100), where N is the number of samples. However, the set of labels is of shape (N, 18), i.e. each "label" is itself an array of 18 elements. I'm quite new to Keras and neural nets, and I only know how to deal with the label when it is one-dimensional (e.g. 0 or 1 in binary classification). How can I deal with multi-dimensional output?
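In other words, the data looks roughly like this (a minimal sketch with made-up numpy arrays; N and the variable names are arbitrary):

import numpy as np

N = 500                               # some number of samples
features = np.random.rand(N, 100)     # one 100-element feature vector per sample
labels = np.random.rand(N, 18)        # one 18-element label array per sample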

Thanks!

Recommended answer

Maybe I don't completely understand the question, but the simplest way would be to have an output layer with 18 neurons. Each neuron outputs one value, i.e. the output will be a vector of 18 values.

One possible way of doing this would be a feed-forward neural network with one hidden layer, e.g. containing 100 neurons. You will need the Dense layer in Keras for this.

from keras.models import Sequential
from keras.layers import Dense

nb_hidden = 100

model = Sequential()
# hidden layer: maps the 100-dimensional input to nb_hidden neurons
model.add(Dense(nb_hidden, input_dim=100))
# output layer: one neuron per label element, so the output is a vector of 18 values
model.add(Dense(18, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adadelta')
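To connect this to the shapes from the question, training could then look roughly like the following (a minimal sketch with placeholder data; the array names, sample count, and epoch count are arbitrary, and with categorical_crossentropy the label rows are usually one-hot vectors or probability distributions):

import numpy as np

N = 1000                         # hypothetical number of samples
X = np.random.rand(N, 100)       # feature vectors, shape (N, 100)
y = np.random.rand(N, 18)        # label vectors, shape (N, 18)

model.fit(X, y, batch_size=32, epochs=10)   # older Keras versions use nb_epoch instead of epochs
predictions = model.predict(X)               # predictions has shape (N, 18)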

Consider varying the number of hidden layers, the general network topology (e.g. including a Dropout layer), and the activation functions until you get a good result.
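For instance, one such variation might look like this (a sketch only; the ReLU activation, layer size, and dropout rate are arbitrary choices, not part of the original answer):

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(100, input_dim=100, activation='relu'))   # hidden layer with ReLU activation
model.add(Dropout(0.5))                                    # randomly drop half the hidden units during training
model.add(Dense(18, activation='softmax'))                 # 18-value output as before
model.compile(loss='categorical_crossentropy', optimizer='adadelta')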
