Dimension reduction using PCA for FastICA

Question

I am trying to develop a system for image classification. I am using the following article:

Independent Component Analysis (ICA) for Texture Classification by Dia Ab Al Nadi and Dr. Ayman M. Mansour

In one paragraph it says:


Given the above texture images, the Independent Components are learned by the method outlined above. The (8 x 8) ICA basis functions for the above textures are shown in Figure 2, respectively. The dimension is reduced by PCA, resulting in a total of 40 functions. Note that independent components from different window sizes are different.

The "method outlined above" is FastICA; the textures are taken from the Brodatz album, and each texture image has 640x640 pixels. My question is:

What do the authors mean by "The dimension is reduced by PCA, resulting in a total of 40 functions", and how can I get those functions using Matlab?

Answer

PCA (Principal Component Analysis) is a method for finding an orthogonal basis (think of a coordinate system) for a high-dimensional (data) space. The "axes" of the PCA basis are sorted by variance, i.e. along the first PCA "axis" your data has the largest variance, along the second "axis" the second largest variance, etc.

This is exploited for dimension reduction: say you have 1000-dimensional data. Then you do a PCA, transform your data into the PCA basis, and throw away all but the first 20 dimensions (just an example). If your data follows a certain statistical distribution, chances are that the 20 PCA dimensions describe your data almost as well as the original 1000 dimensions did. There are methods for finding the number of dimensions to use, but that is beyond the scope of this answer.
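
As a minimal Matlab sketch of that idea (the toy data and variable names are mine, not from the article), using pca from the Statistics Toolbox (princomp in older releases):

X = randn(500, 1000);              % toy data: 500 observations, 1000 variables
k = 20;                            % number of PCA dimensions to keep

% pca centers X and returns components sorted by decreasing variance.
[coeff, score, latent] = pca(X);   % use princomp(X) on older Matlab versions
Xk = score(:, 1:k);                % the data expressed in the first k PCA axes

% latent holds the per-axis variances; this checks how much variance survives:
kept = sum(latent(1:k)) / sum(latent);

Here Xk is the reduced data, and coeff(:, 1:k) is the 1000 x 20 projection matrix you would apply to new observations (after subtracting the training mean).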

Computationally, PCA amounts to finding the eigendecomposition of your data's covariance matrix; in Matlab: [V,D] = eig(cov(MyData)).
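
Note that eig does not order the eigenvalues by decreasing magnitude (for a symmetric matrix they come out in ascending order), so you have to reorder them yourself. Applied to the setting of the question (8 x 8 windows from a 640x640 texture, i.e. 64-dimensional vectors, reduced to 40 PCA dimensions), a sketch could look like the following. The file name, the choice of distinct rather than sliding windows, and the final fastica call (the entry point of the freely available FastICA Matlab toolbox) are my assumptions, not details given by the authors:

img = double(imread('texture.png'));        % assumed file name
patches = im2col(img, [8 8], 'distinct')';  % Image Processing Toolbox; one 64-dim row per window

pm = mean(patches, 1);                      % center the windows
Pc = patches - repmat(pm, size(patches, 1), 1);

[V, D] = eig(cov(Pc));                      % eigendecomposition of the covariance
[vals, idx] = sort(diag(D), 'descend');     % reorder by decreasing variance
V40 = V(:, idx(1:40));                      % 64 x 40 projection matrix (the "40 functions")
P40 = Pc * V40;                             % windows reduced to 40 dimensions

% FastICA then learns the ICA basis functions from the reduced data,
% e.g. icasig = fastica(P40');  (one row per signal, as the toolbox expects)

The columns of V40 are the 40 PCA functions; the ICA basis functions shown in the paper come out of the subsequent FastICA step.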

Note that if you want to work with these concepts you should do some serious reading. A classic article on what you can do with PCA on image data is Turk and Pentland's Eigenfaces. It also gives some background in an understandable way.
