Binary classification with softmax activation always outputs 1


Problem Description

Sorry for the quality of the question, but I'm a beginner here. I was just trying my luck with the Titanic dataset, but the model always predicts that the passenger died. I'll try to explain the code below:

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns


import tensorflow as tf

from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras import losses
from tensorflow.keras.layers.experimental import preprocessing

import os

Loading the dataset:

dataset_dir = os.path.join(os.getcwd(), 'titanic')
train_url = os.path.join(dataset_dir, 'train.csv')
test_url = os.path.join(dataset_dir, 'test.csv')


raw_train_dataset = pd.read_csv(train_url)
raw_test_dataset = pd.read_csv(test_url)


train = raw_train_dataset.copy()
test = raw_test_dataset.copy()

Dropping some columns; I may be wrong here:

train = train.drop(columns=['Cabin', 'Name', 'Ticket'])
test = test.drop(columns=['Cabin', 'Name', 'Ticket'])

One-hot encoding:

train = pd.get_dummies(train, prefix='', prefix_sep='')
test = pd.get_dummies(test, prefix='', prefix_sep='')

Training labels:

train_predict = train.pop('Survived')

Filling null ages with the mean:

train['Age'] = train['Age'].fillna(train['Age'].mean())
test['Age'] = test['Age'].fillna(train['Age'].mean())

Dropping rows with null values:

test = test.dropna()
train = train.dropna()

Creating the normalization layer:

normalizer = preprocessing.Normalization()
normalizer.adapt(np.array(train))

Creating the DNN; am I wrong here?

model = keras.Sequential([
    normalizer,
    layers.Dense(64, activation='relu'),
    layers.Dropout(0.2),
    layers.Dense(1)
])



model.compile(loss=losses.BinaryCrossentropy(from_logits=True),
              optimizer='adam',
              metrics=tf.metrics.BinaryAccuracy(threshold=0.0))


history = model.fit(
    train, train_predict,
    validation_split=0.2,
    epochs=30)

This shows 1 in every case, but I still get an accuracy of 85% when training. I don't need the complete solution to the problem (I want to try that on my own), just the part where I am stuck:

result = tf.nn.softmax(model(train))
print(result)

Recommended Answer

tf.nn.softmax will always return an array that sums to 1. Softmax normalizes each exp(x_i) by the sum of all of them, so since your output is a single value (you have one unit on your final/output layer), it is always transformed to exp(x)/exp(x) = 1:

for value in [.2, .999, .0001, 100., -100.]:
    print(tf.nn.softmax([value]))

tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
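
(For completeness, a sketch of the alternative: softmax only makes sense with one output unit per class, e.g. layers.Dense(2) as the final layer, so that there is a second value to normalize against. The logits below are arbitrary illustrative numbers.)

# With two logits (one per class), softmax yields a genuine probability split.
print(tf.nn.softmax([2.0, -1.0]))
# tf.Tensor([0.95257413 0.04742587], shape=(2,), dtype=float32)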

What you are looking for is tf.nn.sigmoid:

for value in [.2, .999, .0001, 100., -100.]:
    print(tf.nn.sigmoid([value]))

tf.Tensor([0.549834], shape=(1,), dtype=float32)
tf.Tensor([0.7308619], shape=(1,), dtype=float32)
tf.Tensor([0.500025], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([0.], shape=(1,), dtype=float32)

losses.BinaryCrossentropy(from_logits=True) is like sigmoid cross entropy: the loss applies the sigmoid to the raw logits internally, which is why training and the BinaryAccuracy(threshold=0.0) metric worked even though the softmax at inference time did not.
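
As a quick sanity check (a minimal sketch; the logit 0.3 and label 1.0 are arbitrary values, not from the question), the from_logits form matches applying a sigmoid explicitly and then the plain binary cross entropy:

import tensorflow as tf
from tensorflow.keras import losses

logits = tf.constant([[0.3]])  # arbitrary raw model output
labels = tf.constant([[1.0]])

# Loss computed directly on logits, as in the question's compile() call.
loss_from_logits = losses.BinaryCrossentropy(from_logits=True)(labels, logits)

# Same loss computed on probabilities after an explicit sigmoid.
probs = tf.nn.sigmoid(logits)
loss_from_probs = losses.BinaryCrossentropy()(labels, probs)

print(float(loss_from_logits), float(loss_from_probs))  # both ≈ 0.5544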

If you want to round the values to get 0 or 1, use tf.round:

tf.round(tf.nn.sigmoid([.1]))
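
Putting it together for your case (a sketch assuming the model and train variables from the question), the prediction line becomes:

result = tf.round(tf.nn.sigmoid(model(train)))
print(result)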
