8. Eigenvalues: Spectral Decomposition
Michael Friendly
2024-10-05
Source: vignettes/a8-eigen-ex2.Rmd
Setup
This vignette uses an example matrix to illustrate some properties of eigenvalues and eigenvectors. We could consider this to be the variance-covariance matrix of three variables, but the main thing is that the matrix is square and symmetric, which guarantees that the eigenvalues, $\lambda_i$, are real numbers. Because a variance-covariance matrix is also positive semi-definite, the eigenvalues are non-negative, $\lambda_i \ge 0$.
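A reconstruction of the setup code, assumed from the output shown below (matlib is loaded for the rank function R() used later):

library(matlib)  # provides R() for matrix rank

A <- matrix(c(13, -4,  2,
              -4, 11, -2,
               2, -2,  8), nrow = 3, byrow = TRUE)
A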
## [,1] [,2] [,3]
## [1,] 13 -4 2
## [2,] -4 11 -2
## [3,] 2 -2 8
Get the eigenvalues and eigenvectors using eigen(); this returns a named list, with the eigenvalues named values and the eigenvectors named vectors. We call these L and V here, but in formulas they correspond to a diagonal matrix of eigenvalues, $\mathbf{\Lambda}$, and an (orthogonal) matrix of eigenvectors, $\mathbf{V}$.
ev <- eigen(A)
# extract components
(L <- ev$values)
## [1] 17 8 7
(V <- ev$vectors)
## [,1] [,2] [,3]
## [1,] 0.7454 0.6667 0.0000
## [2,] -0.5963 0.6667 0.4472
## [3,] 0.2981 -0.3333 0.8944
Matrix factorization
- Factorization of A: A = V diag(L) V'. That is, the matrix $\mathbf{A}$ can be represented as the product $\mathbf{A} = \mathbf{V} \mathbf{\Lambda} \mathbf{V}'$.
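Checking this numerically, with the input line reconstructed (assumed) from the output:

V %*% diag(L) %*% t(V)  # should reproduce A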
## [,1] [,2] [,3]
## [1,] 13 -4 2
## [2,] -4 11 -2
## [3,] 2 -2 8
- V diagonalizes A: L = V' A V. That is, the matrix $\mathbf{V}$ transforms $\mathbf{A}$ into the diagonal matrix $\mathbf{\Lambda} = \mathbf{V}' \mathbf{A} \mathbf{V}$, corresponding to orthogonal (uncorrelated) variables.
diag(L)
## [,1] [,2] [,3]
## [1,] 17 0 0
## [2,] 0 8 0
## [3,] 0 0 7
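The product $\mathbf{V}' \mathbf{A} \mathbf{V}$ gives the same diagonal matrix; the input below is a reconstruction, with zapsmall() assumed to round tiny floating-point values to exact zeros:

zapsmall(t(V) %*% A %*% V)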
## [,1] [,2] [,3]
## [1,] 17 0 0
## [2,] 0 8 0
## [3,] 0 0 7
Spectral decomposition
The basic idea here is that each eigenvalue–eigenvector pair generates a rank 1 matrix, $\mathbf{A}_i = \lambda_i \mathbf{v}_i \mathbf{v}_i'$, and these sum to the original matrix, $\mathbf{A} = \sum_i \mathbf{A}_i = \sum_i \lambda_i \mathbf{v}_i \mathbf{v}_i'$.
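The inputs below are assumed reconstructions matching the printed components. For the first eigenvalue–eigenvector pair:

A1 <- L[1] * V[, 1] %*% t(V[, 1])  # lambda_1 * v_1 v_1'
A1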
## [,1] [,2] [,3]
## [1,] 9.444 -7.556 3.778
## [2,] -7.556 6.044 -3.022
## [3,] 3.778 -3.022 1.511
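For the second pair:

A2 <- L[2] * V[, 2] %*% t(V[, 2])  # lambda_2 * v_2 v_2'
A2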
## [,1] [,2] [,3]
## [1,] 3.556 3.556 -1.7778
## [2,] 3.556 3.556 -1.7778
## [3,] -1.778 -1.778 0.8889
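And the third:

A3 <- L[3] * V[, 3] %*% t(V[, 3])  # lambda_3 * v_3 v_3'
A3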
## [,1] [,2] [,3]
## [1,] 0 0.0 0.0
## [2,] 0 1.4 2.8
## [3,] 0 2.8 5.6
Then, summing them gives A, so they do decompose A:
A1 + A2 + A3
## [,1] [,2] [,3]
## [1,] 13 -4 2
## [2,] -4 11 -2
## [3,] 2 -2 8
all.equal(A, A1+A2+A3)
## [1] TRUE
Further properties
- The sum of squares of A equals the sum of the sums of squares of A1, A2, and A3:
sum(A^2)
## [1] 402
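The next three results are the sums of squares of each component, their total, and the same total from the squared eigenvalues; the input lines are assumed reconstructions:

c(sum(A1^2), sum(A2^2), sum(A3^2))  # SSQ of each rank-1 component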
## [1] 289 64 49
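sum(A1^2) + sum(A2^2) + sum(A3^2)  # total over the components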
## [1] 402
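sum(L^2)  # same total via the squared eigenvalues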
## [1] 402
- Each squared eigenvalue gives the sum of squares accounted for by the corresponding latent vector:
L^2
## [1] 289 64 49
cumsum(L^2) # cumulative
## [1] 289 353 402
- The first $i$ eigenvalues and vectors give a rank $i$ approximation to $\mathbf{A}$. The rank of each partial sum, computed with R() from matlib, confirms this:
R(A1)
## [1] 1
R(A1 + A2)
## [1] 2
R(A1 + A2 + A3)
## [1] 3
# two dimensions
sum((A1+A2)^2)
## [1] 353
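The proportion of the total sum of squares accounted for by the rank 2 approximation (input line assumed):

sum((A1 + A2)^2) / sum(A^2)  # 353 / 402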
## [1] 0.8781