Eigendecomposition (Personal Study / Deep Learning Basics) 2023. 8. 26. 15:11
I've forgotten a lot of linear algebra, so I'm posting this to refresh my memory.
reference: Deep Learning (Ian Goodfellow, Yoshua Bengio, Aaron Courville)
Eigendecomposition
Eigendecomposition means that we decompose a matrix into a set of eigenvectors and eigenvalues.
An eigenvector of a square matrix $A$ is a nonzero vector $v$ such that multiplication by $A$ alters only the scale of $v$:
$Av = \lambda v$
The scalar $\lambda$ is known as the eigenvalue corresponding to this eigenvector.
If $v$ is an eigenvector of $A$, then so is any rescaled vector $sv$ for $s \in \mathbb{R}, s \neq 0$
Moreover, $sv$ still has the same eigenvalue. For this reason, we usually only look for unit eigenvectors.
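The two properties above can be checked numerically. This is a minimal sketch with a hypothetical 2x2 matrix of my own choosing (not from the book):

```python
import numpy as np

# A small diagonal matrix chosen so the eigenvectors are easy to see.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

v = np.array([0.0, 1.0])   # a unit eigenvector of A
lam = 3.0                  # its corresponding eigenvalue

# Multiplying by A only rescales v: Av = lambda * v
print(np.allclose(A @ v, lam * v))            # True

# Any nonzero rescaling s*v is still an eigenvector with the same eigenvalue.
s = -4.2
print(np.allclose(A @ (s * v), lam * (s * v)))  # True
```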
Suppose that a matrix $A$ has $n$ linearly independent eigenvectors, $\{v^{(1)}, ..., v^{(n)}\}$, with corresponding eigenvalues $\{ {\lambda}_1, ..., {\lambda}_n \}$.
We may concatenate all of the eigenvectors to form a matrix $V$ with one eigenvector per column: $V = [v^{(1)}, ..., v^{(n)}]$.
Likewise, we can concatenate the eigenvalues to form a vector $ \lambda = [{\lambda}_1, ..., {\lambda}_n ]^T $
The eigendecomposition of A is then given by
$A = V \, \mathrm{diag}(\lambda) \, V^{-1}$
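A quick sketch of this decomposition with NumPy, using an arbitrary 2x2 matrix I picked for illustration. `np.linalg.eig` returns the eigenvalues and a matrix whose columns are unit eigenvectors:

```python
import numpy as np

# Hypothetical example matrix with distinct real eigenvalues (5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, V = np.linalg.eig(A)   # columns of V are unit eigenvectors

# Reconstruct A = V diag(lambda) V^{-1}
A_rec = V @ np.diag(lam) @ np.linalg.inv(V)
print(np.allclose(A, A_rec))  # True
```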
We have seen that constructing matrices with specific eigenvalues and eigenvectors allows us to stretch space in desired directions.
However, we often want to decompose matrices into their eigenvalues and eigenvectors.
Doing so can help us to analyze certain properties of the matrix, much as decomposing an integer into its prime factors can help us understand the behavior of that integer.
Not every matrix can be decomposed into eigenvalues and eigenvectors.
In some cases, the decomposition exists, but may involve complex rather than real numbers.
Specifically, every real symmetric matrix can be decomposed into an expression using only real-valued eigenvectors and eigenvalues:
$A = Q \Lambda Q^T$
where $Q$ is an orthogonal matrix composed of eigenvectors of $A$, and $\Lambda$ is a diagonal matrix.
The eigenvalue $\Lambda_{i,i}$ is associated with the eigenvector in column $i$ of $Q$, denoted as $Q_{:,i}$.
Because $Q$ is an orthogonal matrix, we can think of $A$ as scaling space by ${\lambda}_i$ in the direction of $v^{(i)}$.
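For the symmetric case, NumPy provides `np.linalg.eigh`, which is specialized for symmetric/Hermitian input and returns real eigenvalues. A minimal sketch with an illustrative symmetric matrix of my own:

```python
import numpy as np

# A real symmetric matrix (eigenvalues 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, Q = np.linalg.eigh(A)  # eigh handles symmetric/Hermitian matrices

# Q is orthogonal: Q^T Q = I
print(np.allclose(Q.T @ Q, np.eye(2)))           # True

# A = Q Lambda Q^T with real-valued eigenvectors and eigenvalues
print(np.allclose(A, Q @ np.diag(lam) @ Q.T))    # True
```

Note that because $Q^{-1} = Q^T$ for an orthogonal matrix, the general formula $A = V \, \mathrm{diag}(\lambda) \, V^{-1}$ specializes to $A = Q \Lambda Q^T$ here.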