How to obtain the eigenvalues after performing Multidimensional scaling?


Question


I am interested in taking a look at the eigenvalues after performing multidimensional scaling. What function can do that? I looked at the documentation, but it does not mention eigenvalues at all.

Here is a code sample:

from sklearn import manifold

mds = manifold.MDS(n_components=100, max_iter=3000, eps=1e-9,
                   random_state=seed, dissimilarity="precomputed", n_jobs=1)
results = mds.fit(wordDissimilarityMatrix)
# need a way to get the eigenvalues
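For context, sklearn's `MDS` fits via SMACOF stress majorization rather than an eigendecomposition, so after fitting it only exposes the embedding and the final stress value; there are no eigenvalues to retrieve. A minimal sketch (the 3×3 dissimilarity matrix below is made up for illustration):

```python
import numpy as np
from sklearn import manifold

# made-up symmetric dissimilarity matrix for three items
D = np.array([[0.0, 3.0, 5.0],
              [3.0, 0.0, 4.0],
              [5.0, 4.0, 0.0]])

mds = manifold.MDS(n_components=2, dissimilarity="precomputed",
                   random_state=0)
results = mds.fit(D)

print(results.embedding_.shape)  # one 2-D coordinate per item
print(results.stress_)           # final SMACOF stress, not eigenvalues
```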

Solution

I also couldn't find it from reading the documentation. I suspect they aren't performing classical MDS, but something more sophisticated:

"Modern Multidimensional Scaling - Theory and Applications" Borg, I.; Groenen P. Springer Series in Statistics (1997)

"Nonmetric multidimensional scaling: a numerical method" Kruskal, J. Psychometrika, 29 (1964)

"Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis" Kruskal, J. Psychometrika, 29, (1964)

If you're looking for the eigenvalues from classical MDS, then it's not hard to get them yourself. The steps are:

  1. Get your distance matrix, then square it.
  2. Perform double-centering.
  3. Find the eigenvalues and eigenvectors.
  4. Select the top k eigenvalues.
  5. Your i-th principal component is sqrt(eigenvalue_i)*eigenvector_i

See below for a code example:

import numpy as np
import numpy.linalg as la
import pandas as pd

# get some distance matrix
df = pd.read_csv("http://rosetta.reltech.org/TC/v15/Mapping/data/dist-Aus.csv")
A = df.values.T[1:].astype(float)
# square it
A = A**2

# centering matrix: J_c = I - (1/n) * ones
n = A.shape[0]
J_c = np.eye(n) - np.ones((n, n)) / n

# perform double centering
B = -0.5 * J_c.dot(A).dot(J_c)

# find eigenvalues and eigenvectors; B is symmetric, so use eigh,
# which returns them in ascending order -- flip to descending
eigen_val, eigen_vec = la.eigh(B)
eigen_val, eigen_vec = eigen_val[::-1], eigen_vec.T[::-1]

# select top 2 dimensions (for example)
PC1 = np.sqrt(eigen_val[0]) * eigen_vec[0]
PC2 = np.sqrt(eigen_val[1]) * eigen_vec[1]
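As a quick sanity check on this recipe (not part of the original answer; the three points are made up for illustration), running classical MDS on distances generated from known coordinates reproduces those pairwise distances exactly:

```python
import numpy as np
import numpy.linalg as la

# hypothetical 2-D points; their pairwise distances are the input
X = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0]])
D = la.norm(X[:, None, :] - X[None, :, :], axis=-1)

# square, then double-center
A = D**2
n = A.shape[0]
J_c = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J_c.dot(A).dot(J_c)

# B is symmetric: eigh gives ascending order, so flip to descending
eigen_val, eigen_vec = la.eigh(B)
eigen_val, eigen_vec = eigen_val[::-1], eigen_vec[:, ::-1]

# embed with the top 2 eigenpairs; clip tiny negatives before sqrt
Y = eigen_vec[:, :2] * np.sqrt(np.maximum(eigen_val[:2], 0.0))
```

For points that genuinely live in 2 dimensions, only the top two eigenvalues are nonzero, and the embedding `Y` recovers the original distances up to rotation and reflection.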
