Reproduce LightGBM Custom Loss Function for Regression


Problem Description


I want to reproduce the custom loss function for LightGBM. This is what I tried:

lgb.train(params=params, train_set=dtrain, num_boost_round=num_round, fobj=default_mse_obj)


With default_mse_obj being defined as:

def default_mse_obj(y_pred, dtrain):
    # fobj is called as fobj(preds, train_data); labels come from the Dataset
    y_true = dtrain.get_label()
    residual = y_true - y_pred
    grad = -2.0 * residual
    hess = 2.0 + (residual * 0)  # constant Hessian of 2, same shape as residual
    return grad, hess


However, the eval metrics for the default "regression" objective differ from those produced by the custom loss function defined above. I would like to know: what is the default function used by LightGBM for the "regression" objective?
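One relationship worth noting (an illustrative, standalone check, not part of the original post; the function names here are hypothetical): the custom objective returns grad = -2*(y_true - y_pred) = 2*(y_pred - y_true) and hess = 2, i.e. both are exactly twice the unscaled grad = y_pred - y_true, hess = 1, so the per-sample Newton ratio grad/hess is unchanged:

```python
# Hypothetical sketch: compare the custom objective's (grad, hess) with an
# unscaled version for a single sample and check that grad / hess matches.

def custom_obj(y_pred, y_true):
    residual = y_true - y_pred
    return -2.0 * residual, 2.0       # grad and hess both scaled by 2

def unscaled_obj(y_pred, y_true):
    return y_pred - y_true, 1.0       # unscaled grad, unit hess

for y_pred, y_true in [(5.0, 3.0), (-1.0, 4.0), (0.0, 0.5)]:
    g1, h1 = custom_obj(y_pred, y_true)
    g2, h2 = unscaled_obj(y_pred, y_true)
    assert abs(g1 / h1 - g2 / h2) < 1e-12
```

Because the ratio is the same, the pure Newton step per leaf is identical; with non-zero Hessian regularization, however, a scaled Hessian can still change leaf values, which is one possible source of differing results.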

Recommended Answer

As you can see here, this is the default loss function for the regression task:

import numpy as np

def default_mse_obj(y_pred, dtrain):
    y_true = dtrain.get_label()
    grad = y_pred - y_true
    hess = np.ones(len(grad))
    return grad, hess
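As a sanity check of these formulas (an illustrative standalone snippet, not part of the original answer), grad = y_pred - y_true and hess = 1 are the first and second derivatives of the halved squared error 0.5 * (y_pred - y_true) ** 2, which a finite-difference check confirms for a single sample:

```python
# Illustrative check: grad and hess from the default objective match the
# first and second finite-difference derivatives of 0.5 * (y_pred - y_true)**2.

def loss(y_pred, y_true):
    return 0.5 * (y_pred - y_true) ** 2

def grad_hess(y_pred, y_true):
    # Per-sample version of the formulas in default_mse_obj above.
    return y_pred - y_true, 1.0

y_true, y_pred, eps = 3.0, 5.0, 1e-5
grad, hess = grad_hess(y_pred, y_true)

fd_grad = (loss(y_pred + eps, y_true) - loss(y_pred - eps, y_true)) / (2 * eps)
fd_hess = (loss(y_pred + eps, y_true) - 2 * loss(y_pred, y_true)
           + loss(y_pred - eps, y_true)) / eps ** 2

assert abs(fd_grad - grad) < 1e-6
assert abs(fd_hess - hess) < 1e-3
```

The factor of 0.5 explains why the default gradient is y_pred - y_true rather than 2 * (y_pred - y_true); dropping a constant positive scale on both grad and hess leaves the Newton step unchanged.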

