Weighted Decision Trees using Entropy


Problem description

I'm building a binary classification tree using mutual information gain as the splitting function. But since the training data is skewed toward a few classes, it is advisable to weight each training example by the inverse class frequency.
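For concreteness, a minimal sketch of computing such inverse-class-frequency weights in Python (NumPy and the function name are assumptions on my part, not part of the question):

import numpy as np

def inverse_class_frequency_weights(y):
    # Weight each example by 1 / (number of examples in its class),
    # so that every class contributes the same total weight.
    y = np.asarray(y)
    classes, counts = np.unique(y, return_counts=True)
    count_of = dict(zip(classes, counts))
    return np.array([1.0 / count_of[label] for label in y], dtype=float)

With unit weights each class contributes in proportion to its size; with these weights each class contributes equally, which is the usual motivation for inverse-frequency weighting.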

How do I weight the training data? When calculating the probabilities to estimate the entropy, do I take weighted averages?

I'd like an expression for entropy with the weights.
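One common formulation, consistent with the weighted-average reading suggested in the question (this is my assumption, not the original answer), replaces class counts by class weight totals. For a node S with per-example weights w_i:

    H_w(S) = - sum over classes c of (W_c / W) * log2(W_c / W)

where W_c is the total weight of the examples in S belonging to class c and W = sum of all W_c. The information gain of a split is then H_w(parent) minus the children's H_w averaged with weights W_child / W_parent. With all weights equal to 1 this reduces to the ordinary entropy and information gain.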

Recommended answer

State-Value Weighted Entropy as a Measure of Investment Risk:
http://www56.homepage.villanova.edu/david.nawrocki/State%20Weighted%20Entropy%20Nawrocki%20Harding.pdf
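As an illustration only, here is a minimal Python sketch of the weighted-average formulation described above (it is not taken from the linked paper, and NumPy plus the function names are assumptions):

import numpy as np

def weighted_entropy(y, w):
    # Entropy computed from weighted class proportions W_c / W.
    y, w = np.asarray(y), np.asarray(w, dtype=float)
    total = w.sum()
    if total <= 0:
        return 0.0
    entropy = 0.0
    for label in np.unique(y):
        p = w[y == label].sum() / total
        if p > 0:
            entropy -= p * np.log2(p)
    return entropy

def weighted_information_gain(y, w, left_mask):
    # Gain of splitting the node by the boolean mask `left_mask`;
    # each child is weighted by its share of the total example weight.
    y, w = np.asarray(y), np.asarray(w, dtype=float)
    left_mask = np.asarray(left_mask, dtype=bool)
    total = w.sum()
    w_left, w_right = w[left_mask].sum(), w[~left_mask].sum()
    return (weighted_entropy(y, w)
            - (w_left / total) * weighted_entropy(y[left_mask], w[left_mask])
            - (w_right / total) * weighted_entropy(y[~left_mask], w[~left_mask]))

Combining this with the inverse-class-frequency weights above and scoring candidate splits by weighted_information_gain recovers the usual greedy tree-growing procedure, just with weighted totals in place of raw counts.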
