How to obtain features' weights


Problem description

I am dealing with a highly imbalanced data set, and my idea is to obtain the feature-weight values from my libSVM model. For now I am OK with the linear kernel, where I can obtain feature weights, but when I use rbf or poly, I fail to reach my objective.

Here I am using sklearn for my model, and it's easy to obtain feature weights for the linear kernel using .coef_. Can anyone help me do the same thing for rbf or poly? What I've tried so far is given below:

from sklearn.svm import SVC

svr = SVC(C=10, cache_size=200, class_weight='auto', coef0=0.0, degree=3,
          gamma=0.12, kernel='rbf', max_iter=-1, probability=True,
          random_state=0, shrinking=True, tol=0.001, verbose=False)
clf = svr.fit(data_train, target_train)
print(clf.coef_)

Recommended answer

This is not only impossible, as stated in the documentation:

Weights assigned to the features (coefficients in the primal problem). This is only available in the case of linear kernel.

but it also doesn't make sense. In a linear SVM the resulting separating plane lives in the same space as your input features. Therefore its coefficients can be viewed as weights of the input's "dimensions".
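For the linear kernel this is straightforward; a minimal sketch on synthetic data (the make_classification toy data and parameter values here are illustrative, not from the question):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Toy binary-classification data: 100 samples, 4 features (illustrative only)
X, y = make_classification(n_samples=100, n_features=4, random_state=0)

clf = SVC(kernel='linear').fit(X, y)

# For a binary problem, coef_ has shape (1, n_features):
# one weight per input dimension of the separating hyperplane.
print(clf.coef_.shape)
```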

In other kernels, the separating plane exists in another space - a result of kernel transformation of the original space. Its coefficients are not directly related to the input space. In fact, for the rbf kernel the transformed space is infinite-dimensional (you can get a starting point on this on Wikipedia of course).
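Accordingly, sklearn refuses to expose .coef_ for a non-linear kernel at all; a quick check (again on illustrative synthetic data):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=100, n_features=4, random_state=0)

clf = SVC(kernel='rbf', gamma=0.12).fit(X, y)

# Accessing coef_ on an rbf-kernel model raises AttributeError,
# since the hyperplane lives in the (infinite-dimensional) feature space.
try:
    clf.coef_
except AttributeError as err:
    print('no coef_ for rbf:', err)
```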

