Is this the right way of projecting the training set into the eigenspace? MATLAB


Problem description


I have computed PCA using the following:

function [signals,V] = pca2(data) 
[M,N] = size(data); 
data = reshape(data, M*N,1);
% subtract off the mean for each dimension 
mn = mean(data,2); 
data = bsxfun(@minus, data, mean(data,1));     
% construct the matrix Y 
Y = data'*data / (M*N-1); 
[V D] = eigs(Y, 10);   % reduce to 10 dimension
% project the original data 
signals = data * V;

My question is:

Is "signals" the projection of the training set into the eigenspace?

I saw in "Amir Hossein"'s code that the "centered image vectors", i.e. "data" in the code above, need to be projected into the "facespace" by multiplying them by the eigenspace basis. I don't really understand why the projection is done using the centered image vectors. Isn't "signals" enough for classification?
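
For reference, the projection being described usually looks something like the minimal sketch below. The names X, meanFace and testImage are hypothetical, and this is only an assumption about the usual eigenface workflow, not the referenced "Amir Hossein" code; for realistically sized images the pixel covariance is normally too large to form directly, and the smaller Gram matrix Xc'*Xc is decomposed instead.

% X holds the training set, one vectorized face image per column (numPixels x numImages)
meanFace = mean(X, 2);                     % average face
Xc       = bsxfun(@minus, X, meanFace);    % centered image vectors
C        = Xc * Xc' / (size(Xc, 2) - 1);   % pixel covariance matrix
[V, D]   = eigs(C, 10);                    % keep the 10 leading eigenfaces (columns of V)
signals  = V' * Xc;                        % training weights in the eigenspace (10 x numImages)

% a new image must be centered with the SAME mean and projected onto the
% SAME basis before its weights can be compared with the training weights
testVec     = double(testImage(:));        % vectorize the test image
testWeights = V' * (testVec - meanFace);   % its 10 coordinates in the eigenspace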

Solution

By "signals", I assume you mean to ask why we subtract the mean from the raw vector form of the image.

If you think about PCA, it is trying to give you the directions along which the data varies most. However, since your images contain pixels that probably take only positive values, the data always lies on the positive side, which will mislead your first and most important eigenvector in particular: without centering you effectively decompose the second moment matrix rather than the covariance matrix. You can search for more about the second moment matrix, but I will share a rough paint drawing that explains it. Sorry about my drawing.

Please ignore the size of the stars.

Stars: your data

Red line: eigenvectors

As you can easily see in 2D, centering the data gives a better direction for your principal component. If you skip this step, your first eigenvector will be biased toward the mean and give poorer results.
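
To make this concrete, here is a small sketch on made-up 2D data (not data from the question) that compares the leading eigenvector obtained without and with centering:

% toy data: an elongated cloud along [3; 1], shifted far into the positive quadrant
rng(0);
n      = 500;
spread = [3; 1] * randn(1, n) + 0.3 * randn(2, n);
pts    = bsxfun(@plus, spread, [50; 40]);

% without centering: leading eigenvector of the second moment matrix
Sraw        = pts * pts' / (n - 1);
[Vr, Dr]    = eig(Sraw);
[~, ir]     = max(diag(Dr));
vUncentered = Vr(:, ir);                   % roughly parallel to the mean [50; 40] (up to sign)

% with centering: leading eigenvector of the covariance matrix
ptsC      = bsxfun(@minus, pts, mean(pts, 2));
Scen      = ptsC * ptsC' / (n - 1);
[Vc, Dc]  = eig(Scen);
[~, ic]   = max(diag(Dc));
vCentered = Vc(:, ic);                     % roughly parallel to the spread direction [3; 1] (up to sign)

Without centering, the leading eigenvector simply points toward the mean of the cloud; after centering it follows the direction in which the data actually varies, which is exactly the bias described above.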
