7. Eigenvalues and Eigenvectors: Properties
Michael Friendly
2024-10-05
Source: vignettes/a7-eigen-ex1.Rmd
Setup
This vignette uses an example of a $3 \times 3$ matrix to illustrate some properties of eigenvalues and eigenvectors. We could consider this to be the variance-covariance matrix of three variables, but the main thing is that the matrix is square and symmetric, which guarantees that the eigenvalues, $\lambda_i$, are real numbers. Covariance matrices are also positive semi-definite, meaning that their eigenvalues are non-negative, $\lambda_i \ge 0$.
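One way to define such a matrix in R is shown below (a sketch: the values are taken from the printout that follows, and library(matlib) is assumed because the R() and mpower() functions are used later).

library(matlib)

A <- matrix(c(13, -4,  2,
              -4, 11, -2,
               2, -2,  8), nrow = 3, byrow = TRUE)
A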
## [,1] [,2] [,3]
## [1,] 13 -4 2
## [2,] -4 11 -2
## [3,] 2 -2 8
Get the eigenvalues and eigenvectors using eigen(); this returns a named list, with eigenvalues named values and eigenvectors named vectors.
ev <- eigen(A)
# extract components
(values <- ev$values)
## [1] 17 8 7
(vectors <- ev$vectors)
## [,1] [,2] [,3]
## [1,] 0.7454 0.6667 0.0000
## [2,] -0.5963 0.6667 0.4472
## [3,] 0.2981 -0.3333 0.8944
The eigenvalues are always returned in decreasing order, and each column of vectors corresponds to the matching element of values.
Properties of eigenvalues and eigenvectors
The following steps illustrate the main properties of eigenvalues and eigenvectors. We use the notation $\mathbf{A} = \mathbf{V} \mathbf{\Lambda} \mathbf{V}'$ to express the decomposition of the matrix $\mathbf{A}$, where $\mathbf{V}$ is the matrix of eigenvectors and $\mathbf{\Lambda} = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_p)$ is the diagonal matrix composed of the ordered eigenvalues, $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p$.
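As a quick numerical check of this decomposition (a small sketch, not part of the original output), the matrix can be reconstructed from its eigenvectors and eigenvalues:

# V %*% Lambda %*% V' should reproduce A (up to rounding)
vectors %*% diag(values) %*% t(vectors)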
- Orthogonality: Eigenvectors are always orthogonal, $\mathbf{V}' \mathbf{V} = \mathbf{I}$. zapsmall() is handy for cleaning up tiny values.
crossprod(vectors)
## [,1] [,2] [,3]
## [1,] 1.000e+00 3.053e-16 5.551e-17
## [2,] 3.053e-16 1.000e+00 0.000e+00
## [3,] 5.551e-17 0.000e+00 1.000e+00
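Rounding the tiny off-diagonal values to zero (presumably with zapsmall(), as mentioned above) gives the identity matrix:

zapsmall(crossprod(vectors))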
## [,1] [,2] [,3]
## [1,] 1 0 0
## [2,] 0 1 0
## [3,] 0 0 1
- trace(A) = sum of eigenvalues, $\mathrm{tr}(\mathbf{A}) = \sum_i \lambda_i$.
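The trace shown below is presumably computed with matlib's tr() (equivalently, sum(diag(A))):

tr(A)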
## [1] 32
sum(values)
## [1] 32
- sum of squares of A = sum of squares of eigenvalues, $\sum_{i,j} a_{ij}^2 = \sum_i \lambda_i^2$.
sum(A^2)
## [1] 402
sum(values^2)
## [1] 402
- determinant = product of eigenvalues, $\det(\mathbf{A}) = \prod_i \lambda_i$. This means that the determinant will be zero if any $\lambda_i = 0$.
det(A)
## [1] 952
prod(values)
## [1] 952
- rank = number of non-zero eigenvalues. The R() function in matlib computes the rank of a matrix.
R(A)
## [1] 3
sum(values != 0)
## [1] 3
- eigenvalues of the inverse = 1 / eigenvalues of A, $\lambda_i(\mathbf{A}^{-1}) = 1 / \lambda_i(\mathbf{A})$. The eigenvectors are the same, except for order, because eigenvalues are returned in decreasing order.
AI <- solve(A)
AI
## [,1] [,2] [,3]
## [1,] 0.08824 0.02941 -0.01471
## [2,] 0.02941 0.10504 0.01891
## [3,] -0.01471 0.01891 0.13340
eigen(AI)$values
## [1] 0.14286 0.12500 0.05882
eigen(AI)$vectors
## [,1] [,2] [,3]
## [1,] 0.0000 0.6667 0.7454
## [2,] 0.4472 0.6667 -0.5963
## [3,] 0.8944 -0.3333 0.2981
- There are similar relations for other powers of a matrix, $\mathbf{A}^p$: values(mpower(A, p)) = values(A)^p, where mpower(A, 2) = A %*% A, and so on.
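The outputs below illustrate this for the second, third, and fourth powers. The calls shown are a sketch using matlib's mpower(); the squared case could equally be computed as A %*% A.

eigen(mpower(A, 2))   # eigenvalues are 17^2, 8^2, 7^2; eigenvectors are unchanged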
## eigen() decomposition
## $values
## [1] 289 64 49
##
## $vectors
## [,1] [,2] [,3]
## [1,] 0.7454 0.6667 0.0000
## [2,] -0.5963 0.6667 0.4472
## [3,] 0.2981 -0.3333 0.8944
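Similarly, for the third power (again a reconstructed call):

eigen(mpower(A, 3))$values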
## [1] 4913 512 343
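And for the fourth power (also a reconstructed call):

eigen(mpower(A, 4))$values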
## [1] 83521 4096 2401