keras giving same loss on every epoch


Problem Description

I am a newbie to Keras.

I ran it on a dataset where my objective was to reduce the log loss. It gives me the same loss value for every epoch, and I am not sure whether I am on the right track.

For example:

Epoch 1/5
91456/91456 [==============================] - 142s - loss: 3.8019 - val_loss: 3.8278
Epoch 2/5
91456/91456 [==============================] - 139s - loss: 3.8019 - val_loss: 3.8278
Epoch 3/5
91456/91456 [==============================] - 143s - loss: 3.8019 - val_loss: 3.8278
Epoch 4/5
91456/91456 [==============================] - 142s - loss: 3.8019 - val_loss: 3.8278
Epoch 5/5
91456/91456 [==============================] - 142s - loss: 3.8019 - val_loss: 3.8278

Here the loss 3.8019 is the same in every epoch. It is supposed to decrease.

Recommended Answer

I ran into this issue as well. After much deliberation, I figured out that the problem was the activation function on my output layer.

I had this model to predict a binary outcome:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(16, input_shape=(8,), activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(32, activation='relu'))
# Bug: softmax over a single unit always outputs 1.0
model.add(Dense(1, activation='softmax'))
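
(Not part of the original answer.) A quick NumPy check makes the failure mode concrete: softmax normalizes its inputs to sum to 1, so over a single output unit it returns exactly 1.0 regardless of the logit. The prediction is constant, the gradient is zero, and the loss never moves:

import numpy as np

def softmax(x):
    # subtract the max for numerical stability
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(softmax(np.array([2.7])))   # [1.]
print(softmax(np.array([-5.0])))  # [1.] - always 1.0 for a single logit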

and I needed this instead for binary cross-entropy:

model = Sequential()
model.add(Dense(16, input_shape=(8,), activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(32, activation='relu'))
# Fix: sigmoid maps the single logit to a probability in (0, 1)
model.add(Dense(1, activation='sigmoid'))

I would look at the problem you are trying to solve and the output it requires, and make sure your activation functions are what they need to be.
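
As a quick way to verify the fix, here is a minimal training sketch assuming synthetic stand-in data shaped to match the input_shape=(8,) above; the X_train/y_train arrays are placeholders, not from the original question. With the sigmoid output paired with binary_crossentropy, the reported loss should now change from epoch to epoch (and decrease on real, learnable data):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Placeholder data: 1000 samples of 8 features, binary labels
X_train = np.random.rand(1000, 8)
y_train = np.random.randint(0, 2, size=(1000, 1))

model = Sequential()
model.add(Dense(16, input_shape=(8,), activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# binary_crossentropy pairs with the sigmoid output
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=5, batch_size=32, validation_split=0.1)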
