keras combining two losses with adjustable weights


Question

Here is the detailed description. I have a Keras functional model with two layers, with outputs x1 and x2:

x1 = Dense(1,activation='relu')(prev_inp1)

x2 = Dense(2,activation='relu')(prev_inp2)

I need to use these x1 and x2, merge/add them, and come up with a weighted loss function like the one in the attached image, propagating the "same loss" into both branches. Alpha is free to vary between iterations.

Answer

Propagating the "same loss" into both branches will have no effect unless alpha depends on both branches. If alpha is not a variable depending on both branches, then part of the loss will simply be constant with respect to one of them.

So, in that case, just compile the model with the two losses kept separate and pass the weights to the compile method:

model.compile(optimizer='someOptimizer', loss=[loss1,loss2], loss_weights=[alpha,1-alpha])

Compile again when you need alpha to change.
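As a sanity check on what `loss_weights` computes: the total loss is simply alpha*loss1 + (1-alpha)*loss2 over the two outputs. A minimal NumPy sketch of that arithmetic (the MSE loss and the toy values are assumptions for illustration, not part of the original answer):

```python
import numpy as np

def mse(y_true, y_pred):
    # mean squared error, standing in for either per-output loss
    return float(np.mean((y_true - y_pred) ** 2))

alpha = 0.7  # assumed weight; recompile the model whenever it changes

# toy targets/predictions for the two outputs x1 (1 unit) and x2 (2 units)
y1_true, y1_pred = np.array([1.0]), np.array([0.5])
y2_true, y2_pred = np.array([1.0, 2.0]), np.array([1.5, 1.0])

loss1 = mse(y1_true, y1_pred)   # 0.25
loss2 = mse(y2_true, y2_pred)   # 0.625

# this is the combination Keras forms from loss_weights=[alpha, 1-alpha]
total = alpha * loss1 + (1 - alpha) * loss2
print(total)  # 0.3625
```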

But if alpha indeed depends on both branches, then you need to concatenate the results and compute alpha's value inside the loss:

singleOut = Concatenate()([x1,x2])

And a custom loss function:

def weightedLoss(yTrue, yPred):
    # after Concatenate, the last axis holds [x1, x2] together,
    # so slice along it (not along the batch axis)
    x1True = yTrue[:, :1]
    x2True = yTrue[:, 1:]

    x1Pred = yPred[:, :1]
    x2Pred = yPred[:, 1:]

    #calculate alpha somehow with keras backend functions

    return alpha * someLoss(x1True, x1Pred) + (1 - alpha) * someLoss(x2True, x2Pred)

Compile with this function:

model.compile(loss=weightedLoss, optimizer=....)
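To see that the slicing inside a loss like this recovers the two branches from the concatenated output, here is a NumPy sketch of the same arithmetic on a toy batch (the MSE stand-in for `someLoss`, the fixed alpha, and the sample values are all assumptions for illustration):

```python
import numpy as np

def some_loss(y_true, y_pred):
    # stand-in per-branch loss (MSE); the answer leaves someLoss unspecified
    return float(np.mean((y_true - y_pred) ** 2))

# concatenated targets/predictions: column 0 is x1, columns 1-2 are x2
y_true = np.array([[1.0, 1.0, 2.0],
                   [0.0, 2.0, 0.0]])
y_pred = np.array([[0.5, 1.5, 1.0],
                   [0.0, 2.0, 1.0]])

alpha = 0.5  # in the real loss this would come from Keras backend ops

# same slicing as in the custom loss: split along the feature axis
x1_true, x2_true = y_true[:, :1], y_true[:, 1:]
x1_pred, x2_pred = y_pred[:, :1], y_pred[:, 1:]

total = alpha * some_loss(x1_true, x1_pred) \
        + (1 - alpha) * some_loss(x2_true, x2_pred)
print(total)  # 0.34375
```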
