Eigenvector mystery
10-18-2020, 09:31 PM (This post was last modified: 10-18-2020 09:40 PM by John Keith.)
Post: #1
Eigenvector mystery
I was just looking at the documentation for the Julia programming language today and I noticed something interesting.

In the second gray box on this page there is an example showing the calculation of eigenvectors and eigenvalues. I tried the same matrix on the HP 50 in approximate mode and the result for the eigenvectors was substantially different.

Wondering which set of eigenvectors was correct, I checked with Wolfram Alpha, which gave a third, completely different result! Going back to the 50g I tried the same matrix in exact mode, and obtained yet another result different from the previous 3. The eigenvalues in all 4 cases were the same, however. The matrix is

Code:

[[ -4 -17 ]
 [  2   2 ]]

Is this matrix so ill-conditioned that its eigenvectors can't be computed accurately? It doesn't seem so; its determinant is 26.
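
For reference, a quick numpy check backing that up (just a sketch, assuming Python with numpy; the variable names are mine): the computed pairs satisfy Av = λv to rounding error.

Code:
import numpy as np

A = np.array([[-4.0, -17.0],
              [ 2.0,   2.0]])
w, V = np.linalg.eig(A)                # eigenvalues w, eigenvectors in columns of V
for i in range(2):
    r = np.linalg.norm(A @ V[:, i] - w[i] * V[:, i])
    print(w[i], r)                     # residuals near 1e-16: nothing ill-conditioned here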
10-18-2020, 10:54 PM
Post: #2
RE: Eigenvector mystery
If I remember correctly, there are infinitely many eigenvectors; the eigenvalues are what matter most.
Did you check if the eigenvalues are equal?

Thibault - not collector but in love with the few HP models I own - Also musician : http://walruspark.co
10-18-2020, 11:58 PM (This post was last modified: 10-19-2020 01:32 AM by Albert Chan.)
Post: #3
RE: Eigenvector mystery
All eigenvectors are correct. The difference is only a scaling factor.

XCas> m := [[-4,-17], [2,2]]
XCas> eigenvalues(m)      → (-1-5*i,-1+5*i)

Solving for both eigenvectors at once, with λ = -1±5i:

m * [x,y] = λ * [x,y]
(m - λI) * [x,y] = 0

m - λI = \(\left(\begin{array}{cc} -3\mp 5i & -17 \\ 2 & 3\mp 5i \end{array}\right)\)

If you pick the 2nd row, you get Wolfram Alpha's result:

2 x + (3∓5i) y = 0
2 x = (-3±5i) y       → [x, y] = [-3±5i, 2] * t, where t is a free parameter

If you pick the 1st row, you get [x,y] = [-17, 3±5i] * t',
confirming numpy's result (which is probably what Julia uses).

>>> import numpy
>>> k, v = numpy.linalg.eig([[-4,-17],[2,2]])
>>> print(k)
[-1.+5.j -1.-5.j]
>>> print(v)
[[ 0.94590530+0.j           0.94590530+0.j        ]
 [-0.16692447-0.27820744j  -0.16692447+0.27820744j]]
>>> print(v * (-17/v[0][0]))
[[-17.+0.j  -17.+0.j]
 [  3.+5.j    3.-5.j]]

Update: numpy returns normalized eigenvectors (length = 1)
>>> sum(abs(v)**2) ** 0.5       # numpy eigenvectors are column vectors
array([ 1., 1.])
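
A standalone version of the same scaling check (a sketch assuming numpy; `wolfram` is my name for Wolfram Alpha's vector):

Code:
import numpy as np

A = np.array([[-4, -17], [2, 2]], dtype=complex)
w, V = np.linalg.eig(A)
wolfram = np.array([-3+5j, 2])         # 2nd-row result above, for λ = -1+5i
i = int(np.argmin(abs(w - (-1+5j))))   # numpy's column for that same eigenvalue
print(wolfram / V[:, i])               # both components print one and the same constant t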
10-19-2020, 06:33 PM (This post was last modified: 10-19-2020 06:54 PM by JurgenRo.)
Post: #4
RE: Eigenvector mystery
(10-18-2020 10:54 PM)pinkman Wrote:  If I remember correctly, there are infinitely many eigenvectors; the eigenvalues are what matter most.
Did you check if the eigenvalues are equal?
You should really reread this in a textbook ;-) For an n×n matrix there are at most n different eigenvalues (which are numbers). Eigenvectors (which are vectors) belonging to different eigenvalues are linearly independent, so you end up with at most n linearly independent eigenvectors as well. Eigenvalues (w) and eigenvectors (v) of a matrix (M) belong inseparably together and define each other: Mv = wv.

Edit: Typo, equation added
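
A quick numpy illustration of both statements (a sketch, using the matrix from post #1): it has 2 distinct eigenvalues, and its eigenvector matrix is nonsingular, i.e. the eigenvectors are linearly independent.

Code:
import numpy as np

M = np.array([[-4, -17], [2, 2]])
w, V = np.linalg.eig(M)
print(w)                               # 2 distinct eigenvalues for this 2x2 matrix
print(np.linalg.det(V))                # nonzero determinant: the columns are independent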
10-19-2020, 07:13 PM
Post: #5
RE: Eigenvector mystery
(10-19-2020 06:33 PM)JurgenRo Wrote:  Eigenvalues (w) and eigenvectors (v) of a matrix (M) belong inseparably together and define each other: Mv = wv.

True, but so is M(kv) = w(kv)

For eigenvalue w, all kv are valid eigenvectors for it, as long as k ≠ 0
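
For example, a throwaway numpy check (a sketch, using the eigenpair derived in post #3):

Code:
import numpy as np

M = np.array([[-4, -17], [2, 2]], dtype=complex)
w = -1 + 5j
v = np.array([-17, 3+5j])              # eigenvector for w, from the 1st-row derivation
for k in (2, -0.5, 3j):
    print(np.allclose(M @ (k*v), w * (k*v)))   # True for every nonzero k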
10-19-2020, 07:19 PM
Post: #6
RE: Eigenvector mystery
(10-19-2020 07:13 PM)Albert Chan Wrote:  
(10-19-2020 06:33 PM)JurgenRo Wrote:  Eigenvalues (w) and eigenvectors (v) of a matrix (M) belong inseparably together and define each other: Mv = wv.

True, but so is M(kv) = w(kv)

For eigenvalue w, all kv are valid eigenvectors for it, as long as k ≠ 0

Also, an eigenvalue with multiplicity greater than 1 corresponds to an eigenspace of correspondingly higher dimension, so in those cases the eigenvectors are not uniquely determined even if you normalize their lengths...
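
The 2×2 identity matrix is the extreme case (a minimal numpy illustration): λ = 1 has multiplicity 2, every nonzero vector is an eigenvector, and any orthonormal pair of columns would be an equally valid answer.

Code:
import numpy as np

w, V = np.linalg.eig(np.eye(2))
print(w)                               # [1. 1.]: eigenvalue 1 with multiplicity 2
print(V)                               # the standard basis here, but any rotation
                                       # of these columns would be just as correct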
10-19-2020, 07:26 PM
Post: #7
RE: Eigenvector mystery
(10-19-2020 07:13 PM)Albert Chan Wrote:  
(10-19-2020 06:33 PM)JurgenRo Wrote:  Eigenvalues (w) and eigenvectors (v) of a matrix (M) belong inseparably together and define each other: Mv = wv.

True, but so is M(kv) = w(kv)

For eigenvalue w, all kv are valid eigenvectors for it, as long as k ≠ 0

Also true, but not relevant ;-) It's all about picking a set of linearly independent eigenvectors for a certain eigenvalue, i.e. a basis for the corresponding eigenspace.
10-19-2020, 07:35 PM
Post: #8
RE: Eigenvector mystery
(10-19-2020 07:19 PM)Thomas Okken Wrote:  
(10-19-2020 07:13 PM)Albert Chan Wrote:  True, but so is M(kv) = w(kv)

For eigenvalue w, all kv are valid eigenvectors for it, as long as k ≠ 0

Also, an eigenvalue with multiplicity greater than 1 corresponds to an eigenspace of correspondingly higher dimension, so in those cases the eigenvectors are not uniquely determined even if you normalize their lengths...

Uniqueness is not crucial. It's all about the dimension of the eigenspaces (geometric multiplicity) and picking a finite set of linearly independent vectors to form a basis. That there are infinitely many vectors in a vector space is just part of the definition of a vector space (scalar multiplication), so it is irrelevant to mention it at all.
10-19-2020, 08:57 PM
Post: #9
RE: Eigenvector mystery
(10-19-2020 06:33 PM)JurgenRo Wrote:  
(10-18-2020 10:54 PM)pinkman Wrote:  If I remember correctly, there are infinitely many eigenvectors; the eigenvalues are what matter most.
Did you check if the eigenvalues are equal?
You should really reread this in a textbook ;-)

Now I did!
What is still obscure to me is how the choice of eigenvalues is made by the algorithm.
10-19-2020, 09:42 PM
Post: #10
RE: Eigenvector mystery
(10-19-2020 08:57 PM)pinkman Wrote:  
(10-19-2020 06:33 PM)JurgenRo Wrote:  You should really reread this in a textbook ;-)

Now I did!
What is still obscure to me is how the choice of eigenvalues is made by the algorithm.

Eigenvalues are values λ such that Av = λv, so you find them by solving Av − λv = 0, or equivalently, (A − λI)v = 0, where I is the identity matrix, or equivalently, |A − λI| = 0. That last form is also known as the characteristic equation of A, and, being a polynomial of the same degree as the dimension of A, you can find its solutions, and thus the eigenvalues of A, using a polynomial root finder.
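
In numpy terms, that recipe is just two calls (a sketch: numpy.poly builds the characteristic polynomial from A, and numpy.roots is the polynomial root finder):

Code:
import numpy as np

A = np.array([[-4, -17], [2, 2]])
c = np.poly(A)                         # characteristic polynomial: λ² + 2λ + 26
print(np.roots(c))                     # [-1.+5.j -1.-5.j], the eigenvalues of A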
10-20-2020, 12:08 AM
Post: #11
RE: Eigenvector mystery
(10-19-2020 09:42 PM)Thomas Okken Wrote:  Eigenvalues are values λ such that Av = λv, so you find them by solving Av − λv = 0, or equivalently, (A − λI)v = 0, where I is the identity matrix, or equivalently, |A − λI| = 0. That last form is also known as the characteristic equation of A, and, being a polynomial of the same degree as the dimension of A, you can find its solutions, and thus the eigenvalues of A, using a polynomial root finder.

Thomas is fully right.

You can find a program to compute the coefficients of the Characteristic Polynomial (its roots are the eigenvalues) for real or complex matrices in my article:

      HP Article VA047 - Boldly Going - Eigenvalues and Friends

Also, there are many solved examples in the article, to make the matter crystal-clear.

V.

  
All My Articles & other Materials here:  Valentin Albillo's HP Collection
 
10-20-2020, 11:16 PM (This post was last modified: 10-20-2020 11:21 PM by Michael de Estrada.)
Post: #12
RE: Eigenvector mystery
The thing to understand is that the eigenvectors {x} are the solutions to a set of simultaneous equations [A]{x} = {B} in which [A] is a square matrix with a zero determinant and {B} = {0}, so that there are an infinite number of solutions. The eigenvalues are those values which, when subtracted from the diagonal coefficients of [A], make its determinant zero. The eigenvectors can then be determined by substituting each eigenvalue in turn and solving the homogeneous set of equations [A]{x} = {0} for {x}.

As has been mentioned before, although there are an infinite number of solutions, they all have the same shape, differing only by a scale or normalizing factor. This is why eigenvectors are commonly referred to as modeshapes when used to solve problems in vibration theory.

They are just as important as eigenvalues when solving problems involving structural vibrations, as they permit the calculation of the distribution of inertial forces acting on a vibrating structure. In fact the eigenvalues, the square roots of which are referred to as natural frequencies, are merely an intermediate step in the computation of the eigenvectors.
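
As a concrete version of this (a sketch assuming scipy; the 2-DOF stiffness and mass values below are invented purely for illustration):

Code:
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 6.0, -2.0],            # stiffness matrix of a toy 2-DOF structure
              [-2.0,  4.0]])
M = np.array([[ 2.0,  0.0],            # mass matrix
              [ 0.0,  1.0]])
w, modes = eigh(K, M)                  # generalized problem K x = w M x, with w = ω²
print(np.sqrt(w))                      # natural frequencies
print(modes)                           # columns are the modeshapes; their scale is arbitrary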
10-21-2020, 06:57 PM
Post: #13
RE: Eigenvector mystery
(10-20-2020 12:08 AM)Valentin Albillo Wrote:  
(10-19-2020 09:42 PM)Thomas Okken Wrote:  Eigenvalues are values λ such that Av = λv, so you find them by solving Av − λv = 0, or equivalently, (A − λI)v = 0, where I is the identity matrix, or equivalently, |A − λI| = 0. That last form is also known as the characteristic equation of A, and, being a polynomial of the same degree as the dimension of A, you can find its solutions, and thus the eigenvalues of A, using a polynomial root finder.

Thomas is fully right.

You can find a program to compute the coefficients of the Characteristic Polynomial (its roots are the eigenvalues) for real or complex matrices in my article:

      HP Article VA047 - Boldly Going - Eigenvalues and Friends

Also, there are many solved examples in the article, to make the matter crystal-clear.

V.
Well, a few things should still be mentioned here (regarding Thomas's post):
1. For the characteristic polynomial p(x) = |A - xI|, the term |A - xI| denotes the determinant of (A - xI).
2. The algebraic multiplicity of an eigenvalue (i.e. its multiplicity as a root of p) is always >= its geometric multiplicity (i.e. the dimension of the eigenspace belonging to that eigenvalue).
3. I am not sure that "dimension of a matrix" is a valid definition. It's simply the number of rows (or columns).
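
A defective matrix shows point 2 in action (a numpy sketch): the matrix below has algebraic multiplicity 2 for λ = 2 but only a 1-dimensional eigenspace, and eig duly returns two numerically parallel columns.

Code:
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])             # algebraic mult. 2, geometric mult. 1
w, V = np.linalg.eig(A)
print(w)                               # [2. 2.]
print(V)                               # both columns are essentially multiples of [1, 0]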
10-24-2020, 02:03 PM
Post: #15
RE: Eigenvector mystery
Thanks everyone for the enlightening explanations, though most of the material is above my level of mathematical knowledge. :-(

I was reading Valentin's excellent article "Boldly Going - Eigenvalues and Friends" and that led me to another somewhat off-topic question.

The HP 50g command PCAR (characteristic polynomial) returns a polynomial in symbolic form, e.g.

'X^5-19*X^4+79*X^3+146*X^2-1153*X+1222'

Is there a command to convert this symbolic polynomial form to an array such as would be returned by Valentin's PCHAR program? Related commands such as PROOT and PCOEF require their arguments as arrays.

One could write a program to do the conversion by string processing, but it would be messy.
10-26-2020, 04:25 PM
Post: #16
RE: Eigenvector mystery
(10-24-2020 02:03 PM)John Keith Wrote:  'X^5-19*X^4+79*X^3+146*X^2-1153*X+1222'

Is there a command to convert this symbolic polynomial form to an array

XCas has symb2poly, but I can't find an equivalent for the HP 50g.

XCas> symb2poly((x+1)*(x+2)*(x+3))       → poly1[1, 6, 11, 6]

Code:
≪ 0 OVER DEGREE → D                         @ D = degree of the symbolic polynomial
   ≪ 3. DUP D + FOR k HORNER k ROLLD NEXT   @ HORNER at 0 peels off one coefficient per pass
      + + D 1 + →ARRY                       @ fold leftover zeros, build the (D+1)-element array
   ≫
≫

Save the above as, say, PARRY.

hp50g> '(X+1)*(X+2)*(X+3)'
hp50g> PARRY        → [1 6 11 6]
hp50g> PROOT       → [-1. -2. -3.]

hp50g> 'X^5-19*X^4+79*X^3+146*X^2-1153*X+1222'
hp50g> PARRY        → [1 -19 79 146 -1153 1222]
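
For cross-checking away from the calculator, sympy does the same conversion (a sketch, assuming Python with sympy installed):

Code:
from sympy import symbols, Poly

X = symbols('X')
p = X**5 - 19*X**4 + 79*X**3 + 146*X**2 - 1153*X + 1222
print(Poly(p, X).all_coeffs())         # [1, -19, 79, 146, -1153, 1222]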
10-27-2020, 12:37 PM
Post: #17
RE: Eigenvector mystery
Thanks Albert, that works great! Interestingly, executing PROOT on the array returns the same roots as EGVL, but in a different order.