How to use softmax activation function at the output layer, but relus in the middle layers in TensorFlow?

Question

I have a neural net with 3 hidden layers (so 5 layers in total). I want to use Rectified Linear Units at each of the hidden layers, but at the outermost layer I want to apply Softmax on the logits. I want to use the DNNClassifier. I have read TensorFlow's official documentation, which says this about the activation_fn parameter:

activation_fn: Activation function applied to each layer. If None, will use tf.nn.relu.

I know I can always write my own model and use any arbitrary combination of activation functions. But since the DNNClassifier is more concrete, I want to resort to that. So far I have:

classifier = tf.contrib.learn.DNNClassifier(
  feature_columns=features_columns,
  hidden_units=[10, 20, 10],
  n_classes=3
  # I want something like the following:
  # activation_fn=[relu, relu, relu, softmax]
)

Answer

Sorry to say, but this is not possible using only one DNNClassifier. As your example shows, you can supply an activation_fn:

Activation function applied to each layer. If None, will use tf.nn.relu.
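
That parameter accepts a single function applied to every hidden layer. As a sketch, reusing the question's API and variable names, it would be passed like this:

classifier = tf.contrib.learn.DNNClassifier(
  feature_columns=features_columns,
  hidden_units=[10, 20, 10],
  n_classes=3,
  activation_fn=tf.nn.relu  # one function for all hidden layers
)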

But not a separate one for each layer. To solve your problem, you have to chain this classifier to another layer that applies the softmax activation function.
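
For illustration, here is a minimal sketch of the write-your-own-model route the question mentions, built on the TF 1.x tf.layers API. The model_fn helper and its signature are assumptions for this example, not part of the original answer:

import tensorflow as tf

def model_fn(features, labels):  # hypothetical helper for illustration
    # Hidden layers with ReLU, mirroring hidden_units=[10, 20, 10]
    net = features
    for units in [10, 20, 10]:
        net = tf.layers.dense(net, units, activation=tf.nn.relu)
    # Linear output layer producing logits for 3 classes
    logits = tf.layers.dense(net, 3, activation=None)
    # Softmax on the logits, as the question asks
    probabilities = tf.nn.softmax(logits)
    # For training, fold the softmax into the loss for numerical stability
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    return probabilities, loss

Note that tf.losses.sparse_softmax_cross_entropy applies the softmax internally, which is the numerically stable way to train; the explicit tf.nn.softmax is then only needed for reporting predicted probabilities.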
