Unexpected eigenvectors in numPy

Question

I have seen this question, and it is relevant to my attempt to compute the dominant eigenvector in Python with numPy.

I am trying to compute the dominant eigenvector of an n x n matrix without having to get into too much heavy linear algebra. I did cursory research on determinants, eigenvalues, eigenvectors, and characteristic polynomials, but I would prefer to rely on the numPy implementation for finding eigenvalues as I believe it is more efficient than my own would be.

The problem I encountered was that I used this code:

    from numpy import array
    from numpy.linalg import eig

    markov = array([[0.8, 0.2], [0.1, 0.9]])
    print(eig(markov))

...as a test, and got this output:

    (array([ 0.7,  1. ]), array([[-0.89442719, -0.70710678],
           [ 0.4472136 , -0.70710678]]))

What concerns me about this is that by the Perron-Frobenius theorem, all of the components of the second eigenvector should be positive (since, according to Wikipedia, "a real square matrix with positive entries has a unique largest real eigenvalue and that the corresponding eigenvector has strictly positive components").

Anyone know what's going on here? Is numPy wrong? Have I found an inconsistency in ZFC? Or is it just me being a noob at linear algebra, Python, numPy, or some combination of the three?

Thanks for any help that you can provide. Also, this is my first SO question (I used to be active on cstheory.se though), so any advice on improving the clarity of my question would be appreciated, too.

Answer

You are just misinterpreting eig's return value. According to the docs, the second return value is

The normalized (unit "length") eigenvectors, such that the column v[:,i] is the eigenvector corresponding to the eigenvalue w[i].
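
In other words, the eigenvectors are read out column-wise, with v[:,i] paired to w[i]. As a minimal sketch of what the question was originally after, here is one way to pull out the dominant eigenvector by selecting the column whose eigenvalue has the largest magnitude (dominant_eigenvector is just an illustrative helper name, not a numpy function):

    import numpy as np

    def dominant_eigenvector(A):
        # eig returns (eigenvalues w, eigenvectors v), where the
        # column v[:, i] is the eigenvector corresponding to w[i]
        w, v = np.linalg.eig(A)
        i = np.argmax(np.abs(w))  # index of the largest-magnitude eigenvalue
        return w[i], v[:, i]

    markov = np.array([[0.8, 0.2], [0.1, 0.9]])
    val, vec = dominant_eigenvector(markov)
    print(val)   # 1.0
    print(vec)   # [-0.70710678 -0.70710678]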

So the eigenvector corresponding to eigenvalue 1 is not [ 0.4472136 , -0.70710678], but [-0.70710678, -0.70710678], as can be easily verified:

>>> markov.dot([ 0.4472136 , -0.70710678]) # not an eigenvector
array([ 0.21634952, -0.59167474])
>>> markov.dot([-0.70710678, -0.70710678]) # an eigenvector
array([-0.70710678, -0.70710678])
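
Note also that an eigenvector is only determined up to a scalar factor, so eig is free to return the negated vector; the negative signs do not contradict the Perron-Frobenius theorem. Rescaling (here, dividing by the sum of the components) recovers the strictly positive eigenvector the theorem promises. A quick check in the same session, assuming markov and eig are still in scope:

>>> v = eig(markov)[1][:, 1]   # column paired with eigenvalue 1
>>> v / v.sum()                # rescaled: strictly positive components
array([ 0.5,  0.5])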
